To see the other types of publications on this topic, follow the link: Surface interpolation.

Dissertations / Theses on the topic 'Surface interpolation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Surface interpolation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Asaturyan, Souren. "Shape preserving surface interpolation schemes." Thesis, University of Dundee, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.278368.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bejancu, Aurelian. "Convergence properties of surface spline interpolation." Thesis, University of Cambridge, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.621711.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Leung, Nim Keung. "Convexity-Preserving Scattered Data Interpolation." Thesis, University of North Texas, 1995. https://digital.library.unt.edu/ark:/67531/metadc277609/.

Full text
Abstract:
Surface fitting methods play an important role in many scientific fields as well as in computer aided geometric design. The problem treated here is that of constructing a smooth surface that interpolates data values associated with scattered nodes in the plane. The data is said to be convex if there exists a convex interpolant. The problem of convexity-preserving interpolation is to determine if the data is convex, and construct a convex interpolant if it exists.
APA, Harvard, Vancouver, ISO, and other styles
4

Al-Tahir, Raid A. "Interpolation and analysis in hierarchical surface reconstruction /." The Ohio State University, 1995. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487862972135505.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Dickens, Nicholas A. "Smooth curve interpolation and surface construction in CAD." Thesis, Loughborough University, 1986. https://dspace.lboro.ac.uk/2134/32397.

Full text
Abstract:
The widespread adoption of CAD in recent years has highlighted problems in both curve and surface representation. There is a need for algorithms to reduce the amount of data input, and for a fuller understanding of the underlying mathematics. This thesis falls naturally into two parts, dealing respectively with automatic smooth curve interpolation and surface construction.
APA, Harvard, Vancouver, ISO, and other styles
6

Bergsjö, Joline. "Photogrammetric point cloud generation and surface interpolation for change detection." Thesis, KTH, Geodesi och satellitpositionering, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-190882.

Full text
Abstract:
In recent years the science revolving around image-matching algorithms has seen an upswing, mostly due to its benefits in computer vision. This has created new opportunities for photogrammetric methods to compete with LiDAR data when it comes to 3D point clouds and generating surface models. In Sweden, a project to create a high-resolution national height model started in 2009, and today almost the entirety of Sweden has been scanned with LiDAR sensors. The objective of this project is to achieve a height model with high spatial resolution and high accuracy in height. As of today, no update of this model is planned within the project, so it is up to each municipality or company that needs a recent height model to update it themselves. This thesis aims to investigate the benefits and shortcomings of using photogrammetric measurements for generating and updating surface models. Two image-matching software packages, ERDAS Photogrammetry and Spacemetric Keystone, are used to generate a 3D point cloud of a rural area in Botkyrka municipality. The point clouds are interpolated into surface models using different interpolation percentiles and different resolutions. The photogrammetric point clouds are evaluated on how well they fit a reference point cloud; the surfaces are evaluated on how they are affected by the different interpolation percentiles and image resolutions. An analysis is also performed to see whether the accuracy improves when the point cloud is interpolated into a surface. The results show that photogrammetric point clouds follow the profile of the ground well but contain a lot of noise in forest-covered areas. A lower image resolution improves the accuracy for the forest features in the surfaces. The results also show that noise reduction is essential to generate a surface with decent accuracy. Furthermore, the results identify problem areas in dry deciduous forest where the photogrammetric method fails to capture the forest.
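The percentile-based interpolation of a point cloud into a surface model evaluated in this thesis can be conveyed with a minimal sketch: rasterise the cloud by taking a chosen percentile of the elevations falling in each grid cell (a low percentile approximates the ground, a high one the canopy). The function name, cell size, and data below are illustrative, not taken from the thesis:

```python
import numpy as np

def grid_percentile(x, y, z, cell, q):
    """Rasterise a point cloud: take the q-th percentile of z in each cell.
    A low q approximates the ground; a high q approximates the canopy."""
    ix = ((x - x.min()) // cell).astype(int)
    iy = ((y - y.min()) // cell).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1
    grid = np.full((ny, nx), np.nan)        # NaN where a cell has no points
    for j in range(ny):
        for i in range(nx):
            in_cell = (ix == i) & (iy == j)
            if in_cell.any():
                grid[j, i] = np.percentile(z[in_cell], q)
    return grid
```

Varying `q` and `cell` mimics the thesis's experiments with different interpolation percentiles and resolutions.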
APA, Harvard, Vancouver, ISO, and other styles
7

Flanagin, Maik. "The Hydraulic Spline: Comparisons of Existing Surface Modeling Techniques and Development of a Spline-Based Approach for Hydrographic and Topographic Surface Modeling." ScholarWorks@UNO, 2007. http://scholarworks.uno.edu/td/613.

Full text
Abstract:
Creation of accurate and coherent surface models is vital to the effective planning and construction of flood control and hurricane protection projects. Typically, topographic surface models are synthesized from Delaunay triangulations or interpolated raster grids. Although these techniques are adequate in most general situations, they do not effectively address the specific case where topographic data is available only as cross-section and profile centerline data, such as the elevation sampling produced by traditional hydrographic surveys. The hydraulic spline algorithm was developed to generate irregular two-dimensional channel grids from hydrographic cross-sections at any desired resolution. Hydraulic spline output grids can be easily merged with datasets of higher resolution, such as LIDAR data, to build a complete model of channel geometry and overbank topography. In testing, the hydraulic spline algorithm faithfully reproduces elevations of known input cross-section points where they exist, while generating a smooth transition between known cross-sections. The algorithm performs particularly well compared to traditional techniques with respect to aesthetics and accuracy when input data is sparse. These qualities make the hydraulic spline an ideal choice for practical applications where available data may be limited due to historic or budgetary reasons.
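The core idea of filling the gaps between surveyed cross-sections with a smooth spline can be sketched with ordinary cubic-spline interpolation along the channel centerline. This is only an illustration of the underlying principle with made-up stations and elevations, not the thesis's hydraulic spline algorithm itself:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical survey: bed elevation at one lateral offset on each of four
# cross-sections, indexed by station s (metres along the channel centreline).
s = np.array([0.0, 100.0, 250.0, 400.0])      # stations of the surveyed sections
z = np.array([-4.2, -5.0, -4.6, -3.9])        # bed elevation at those stations

profile = CubicSpline(s, z)                   # smooth longitudinal profile
dense = profile(np.linspace(0.0, 400.0, 81))  # elevations between the sections
```

The spline reproduces the known elevations at the surveyed stations exactly while transitioning smoothly between them, which is the property the abstract highlights.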
APA, Harvard, Vancouver, ISO, and other styles
8

Zhu, Lei. "On Visualizing Branched Surface: an Angle/Area Preserving Approach." Diss., Available online, Georgia Institute of Technology, 2004. http://etd.gatech.edu/theses/available/etd-09142004-114941/.

Full text
Abstract:
Thesis (Ph. D.)--Biomedical Engineering, Georgia Institute of Technology, 2006.
Anthony J. Yezzi, Committee Member ; James Gruden, Committee Member ; Allen Tannenbaum, Committee Chair ; May D. Wang, Committee Member ; Oskar Skrinjar, Committee Member. Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
9

Lai, Shuhua. "Subdivision Surface based One-Piece Representation." UKnowledge, 2006. http://uknowledge.uky.edu/gradschool_diss/330.

Full text
Abstract:
Subdivision surfaces are capable of modeling and representing complex shapes of arbitrary topology. However, methods on how to build the control mesh of a complex surface are not studied much. Currently, most meshes of complicated objects come from triangulation and simplification of raster scanned data points, like the Stanford 3D Scanning Repository. This approach is costly and leads to very dense meshes.

Subdivision surface based one-piece representation means representing the final object in a design process with only one subdivision surface, no matter how complicated the object's topology or shape. Hence the number of parts in the final representation is always one.

In this dissertation we present the necessary mathematical theories and geometric algorithms to support subdivision surface based one-piece representation. First, an explicit parametrization method is presented for exact evaluation of Catmull-Clark subdivision surfaces. Based on it, two approaches are proposed for constructing the one-piece representation of a given object with arbitrary topology. One approach is to construct the one-piece representation by using the interpolation technique. Interpolation is a natural way to build models, but the fairness of the interpolating surface is a big concern in previous methods. With the similarity based interpolation technique, we can obtain better modeling results with fewer undesired artifacts and undulations. The other approach is through performing Boolean operations. Up to this point, accurate Boolean operations over subdivision surfaces have not been approached in the literature. We present a robust and error controllable Boolean operation method which results in a one-piece representation. Because one-piece representations resulting from the above two methods are usually dense, error controllable simplification of one-piece representations is needed. Two methods are presented for this purpose: adaptive tessellation and multiresolution analysis. Both methods can significantly reduce the complexity of a one-piece representation while having accurate error estimation. A system that performs subdivision surface based one-piece representation was implemented, and many examples have been tested. All the examples show that our approaches can obtain very good subdivision based one-piece representation results. Even though our methods are based on the Catmull-Clark subdivision scheme, we believe they can be adapted to other subdivision schemes as well with small modifications.
APA, Harvard, Vancouver, ISO, and other styles
10

Langton, Michael Keith. "Radial Basis Functions Applied to Integral Interpolation, Piecewise Surface Reconstruction and Animation Control." Thesis, University of Canterbury. Mathematics and Statistics, 2009. http://hdl.handle.net/10092/4078.

Full text
Abstract:
This thesis describes theory and algorithms for use with Radial Basis Functions (RBFs), emphasising techniques motivated by three particular application areas. In Part I, we apply RBFs to the problem of interpolating to integral data. While the potential of using RBFs for this purpose has been established in an abstract theoretical context, their use has been lacking an easy to check sufficient condition for finding appropriate parent basic functions, and explicit methods for deriving integral basic functions from them. We present both these components here, as well as explicit formulations for line segments in two dimensions and balls in three and five dimensions. We also apply these results to real-world track data. In Part II, we apply Hermite and pointwise RBFs to the problem of surface reconstruction. RBFs are used for this purpose by representing the surface implicitly as the zero level set of a function in 3D space. We develop a multilevel piecewise technique based on scattered spherical subdomains, which requires the creation of algorithms for constructing sphere coverings with desirable properties and for blending smoothly between levels. The surface reconstruction method we develop scales very well to large datasets and is very amenable to parallelisation, while retaining global-approximation-like features such as hole filling. Our serial implementation can build an implicit surface representation which interpolates at over 42 million points in around 45 minutes. In Part III, we apply RBFs to the problem of animation control in the area of motion synthesis---controlling an animated character whose motion is entirely the result of simulated physics. While the simulation is quite well understood, controlling the character by means of forces produced by virtual actuators or muscles remains a very difficult challenge. 
Here, we investigate the possibility of speeding up the optimisation process underlying most animation control methods by approximating the physics simulator with RBFs.
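The RBF machinery underlying all three parts has a common core: solve a linear system so that a weighted sum of radially symmetric basic functions matches the data exactly. A minimal sketch using a Gaussian basic function, one common choice (the thesis treats a wider family, and the function names here are invented):

```python
import numpy as np

def rbf_interpolate(centers, values, queries, sigma=1.0):
    """Scattered-data interpolation with Gaussian radial basis functions."""
    phi = lambda r: np.exp(-(r / sigma) ** 2)   # Gaussian basic function
    # Interpolation conditions: sum_j c_j * phi(|x_i - x_j|) = f_i for all i
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    coef = np.linalg.solve(phi(d), values)
    dq = np.linalg.norm(queries[:, None, :] - centers[None, :, :], axis=2)
    return phi(dq) @ coef
```

For implicit surface reconstruction (Part II), the same solve is applied to signed off-surface samples and the surface is recovered as the zero level set of the resulting function.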
APA, Harvard, Vancouver, ISO, and other styles
11

Detweiler, Zachary Ray. "Techniques for using 3D Terrain Surface Measurements for Vehicular Simulations." Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/32089.

Full text
Abstract:
Throughout a ground vehicle development program, it is necessary to possess the loads the vehicle will experience. Unfortunately, actual loads are only available at the conclusion of the program, when the vehicle has been built and design changes are costly. The design engineer is challenged with using predicted loads early in the design process, when changes are relatively easy and inexpensive to make. It is advantageous, therefore, to accurately predict these loads early in the program, thus improving the vehicle design and, ultimately, saving time and money. The prediction of these loads depends on the fidelity of the vehicle models and their excitation. The focus of this thesis is the development of techniques for using 3D terrain surface measurements for vehicular simulations. Contributions are made to vehicle model parameter identification, terrain filtering, and application-dependent interpolation methods for 3D terrain surfaces.

Modeling and simulation are used to improve and shorten a vehicle's development cycle, thus saving time and money. An important aspect in developing a vehicle model is to identify the parameters. Some parameters are easily measured with readily available tools; however, other parameters require dismantling the vehicle or using expensive test equipment. Initial estimates of these difficult or costly to obtain parameters are made based on similar vehicle models or standard practices. In this work, a parameter identification method is presented to obtain a better estimate of these inaccessible parameters using measured terrain excitations. By knowing the excitations to the physical vehicle, the simulated response can be compared to the measured response, and then the vehicle model's parameters can be optimized such that the error between the responses is minimized. Through this process, better estimates of the vehicle's parameters are obtained, which demonstrates that measured terrain can improve vehicle development by increasing the accuracy of parameter estimates.

The principal excitation to any ground vehicle is the terrain, and by obtaining more accurate representations of the terrain, vehicular simulation techniques are advanced. Many simple vehicle models use a point contact tire model, which performs poorly when short-wavelength irregularities are present because the model neglects the tire's mechanical filtering properties. Therefore, a filter is used to emulate a tire's mechanical filtering mechanism and create an effective terrain profile. In this work, terrain filters are evaluated to quantify their effect on the sprung mass response of the dynamic simulation of a seven degree of freedom vehicle model.

In any vehicular simulation, there is a balance between analytical expense and simulation realism. This balance often limits simulations to 2D terrain profile excitations, but as computing power increases the computational expense decreases. Thus, 3D terrain excitations for vehicular simulation are a tool for advancing simulation realism that is becoming less computationally expensive. Three dimensional terrain surfaces are measured with a non-uniform spacing in the horizontal plane; therefore, application-dependent gridding methods are developed in this work to interpolate 3D terrain surface to uniform grid spacing. The uniform grid spacing allows 3D terrain surfaces to be used more efficiently in any vehicular simulation when compared to non-uniform spacing.
Master of Science
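The gridding step described in the final paragraph, resampling non-uniformly spaced terrain measurements onto a uniform grid, can be sketched with off-the-shelf linear interpolation over a triangulation; the helper name and cell size below are hypothetical, not the thesis's application-dependent methods:

```python
import numpy as np
from scipy.interpolate import griddata

def to_uniform_grid(points, z, cell):
    """Resample scattered (x, y) -> z terrain samples onto a uniform grid
    by linear interpolation over the Delaunay triangulation of the points."""
    x0, x1 = points[:, 0].min(), points[:, 0].max()
    y0, y1 = points[:, 1].min(), points[:, 1].max()
    gx, gy = np.meshgrid(np.arange(x0, x1 + cell, cell),
                         np.arange(y0, y1 + cell, cell))
    gz = griddata(points, z, (gx, gy), method='linear')  # NaN outside the hull
    return gx, gy, gz
```

Uniform spacing is what lets downstream vehicle simulations index terrain height directly instead of searching a scattered set.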

APA, Harvard, Vancouver, ISO, and other styles
12

Jin, Menglin. "Interpolation of surface radiative temperature measured from polar orbiting satellites to a diurnal cycle." Diss., The University of Arizona, 1999. http://hdl.handle.net/10150/282883.

Full text
Abstract:
The land surface skin temperature diurnal cycle (LSTD) is very important for understanding surface climate and for evaluating climate models. This variable, however, cannot be obtained globally from polar-orbiting satellites, because the satellites usually pass a given area only twice per day and because their infrared channels cannot observe the surface when the sky is cloudy. To make better use of the satellite data, this research is designed, for the first time, to solve the above two problems through advanced use of remote-sensing techniques and climate modeling. Specifically, this work is divided into two parts. Part one deals with obtaining the skin temperature diurnal cycle for cloud-free cases. We have developed a "cloud-free algorithm" to combine model results with satellite and surface-based observations, thus interpolating the satellite's twice-daily observations to the full diurnal cycle. Part two studies the cloudy cases. The "cloudy-pixel treatment" presented here is a hybrid of the "neighboring-pixel" and "surface air temperature" approaches. The whole algorithm has been tested against field experiments and the climate model CCM3/BATS in global and single-column mode simulations. The tests show that the proposed algorithm can obtain skin temperature diurnal cycles with an accuracy of 1-2 K at the monthly pixel level.
APA, Harvard, Vancouver, ISO, and other styles
13

Van den Bergh, F., M. A. van Wyk, B. J. van Wyk, and G. Udahemuka. "A comparison of data-driven and model-driven approaches to brightness temperature diurnal cycle interpolation." SAIEE Africa Research Journal, 2007. http://encore.tut.ac.za/iii/cpro/DigitalItemViewPage.external?sp=1001082.

Full text
Abstract:
This paper presents two new schemes for interpolating missing samples in satellite diurnal temperature cycles (DTCs). The first scheme, referred to here as the cosine model, is an improvement of the model proposed in [2] and combines a cosine and exponential function for modelling the DTC. The second scheme uses the notion of a Reproducing Kernel Hilbert Space (RKHS) interpolator [1] for interpolating the missing samples. The application of RKHS interpolators to the DTC interpolation problem is novel. Results obtained by means of computer experiments are presented.
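The general shape of such DTC models, a cosine branch during daytime joined continuously to a decay branch after a transition time, can be sketched as follows. This generic form only conveys the idea; it is not the exact model of [2] or the paper's improvement of it, and all parameter names are illustrative:

```python
import numpy as np

def dtc(t, T0, Ta, tm, w, ts, k):
    """Generic diurnal temperature cycle: cosine around the daily peak at tm
    (half-width w hours), exponential relaxation toward T0 after time ts.
    Continuity at ts holds by construction."""
    day = T0 + Ta * np.cos(np.pi / w * (t - tm))
    Ts = T0 + Ta * np.cos(np.pi / w * (ts - tm))   # temperature at transition
    night = T0 + (Ts - T0) * np.exp(-(t - ts) / k)
    return np.where(t < ts, day, night)
```

Fitting the free parameters to the available clear-sky samples, then evaluating the model at the missing times, is what "interpolating missing samples" with such a model amounts to.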
APA, Harvard, Vancouver, ISO, and other styles
14

Wang, Jiaxi. "PARAMETRIZATION AND SHAPE RECONSTRUCTION TECHNIQUES FOR DOO-SABIN SUBDIVISION SURFACES." UKnowledge, 2008. http://uknowledge.uky.edu/gradschool_theses/509.

Full text
Abstract:
This thesis presents a new technique for the reconstruction of a smooth surface from a set of 3D data points. The reconstructed surface is represented by an everywhere C1-continuous subdivision surface which interpolates all the given data points, and the topological structure of the reconstructed surface is exactly the same as that of the data points. The new technique consists of two major steps. First, an efficient surface reconstruction method is used to produce a polyhedral approximation M to the given data points. Second, a Doo-Sabin subdivision surface is constructed that smoothly passes through all the data points in the given data set. A new technique is presented for the second step in this thesis. It iteratively modifies the vertices of the polyhedral approximation M until a new control mesh is reached whose Doo-Sabin subdivision surface interpolates M. It is proved that, for any mesh M of any size and any topology, the iterative process always converges for the Doo-Sabin subdivision scheme. The new technique has the advantages of both a local method and a global method, and the surface reconstruction process can reproduce special features such as edges and corners faithfully.
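The iterative control-mesh adjustment described above can be conveyed with a toy analogue: repeatedly offset each control point by its current interpolation error until the smoothed result reproduces the data. Here a closed-polygon averaging mask stands in for the true Doo-Sabin limit-point evaluation, so this is an illustration of the iteration scheme only, not the thesis's method:

```python
import numpy as np

def smooth(P):
    """Stand-in 'limit point' operator: closed-polygon local averaging.
    (The real method evaluates the Doo-Sabin limit surface here.)"""
    return 0.5 * P + 0.25 * (np.roll(P, 1, axis=0) + np.roll(P, -1, axis=0))

def progressive_interpolate(D, iters=500):
    """Adjust control points until smooth(P) reproduces the data D."""
    P = D.copy()
    for _ in range(iters):
        P += D - smooth(P)     # move each vertex by its interpolation error
    return P
```

The iteration converges whenever the smoothing operator's spectrum stays in (0, 2); proving the analogous property for Doo-Sabin subdivision is the convergence result the abstract refers to.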
APA, Harvard, Vancouver, ISO, and other styles
15

Morel, Jules. "Surface reconstruction based on forest terrestrial LiDAR data." Thesis, Aix-Marseille, 2017. http://www.theses.fr/2017AIXM0039/document.

Full text
Abstract:
In recent years, the capacity of LiDAR technology to capture detailed information about forest structure has attracted increasing attention in the field of forest science. In particular, terrestrial LiDAR arises as a promising tool to retrieve the geometrical characteristics of trees at millimeter accuracy. This thesis studies surface reconstruction from the scattered, unorganized point clouds captured in forested environments by a terrestrial LiDAR. We propose a suite of algorithms dedicated to reconstructing models of forest plot attributes: the ground and the woody structure of the trees (i.e., the trunks and the main branches). In practice, our approaches model the surfaces with implicit functions built from radial basis functions, in order to cope with the strong spatial heterogeneity and the noise of the terrestrial LiDAR point cloud.
APA, Harvard, Vancouver, ISO, and other styles
16

Friedrich, Tobias [Verfasser]. "Dynamical interpolation of surface pCO2 between lines of observation in the North Atlantic Ocean / Tobias Friedrich." Kiel : Universitätsbibliothek Kiel, 2010. http://d-nb.info/1019951397/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Rosenberg, Charles Joseph. "A lossy image compression algorithm based on nonuniform sampling and interpolation of the image intensity surface." Thesis, Massachusetts Institute of Technology, 1990. http://hdl.handle.net/1721.1/13576.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1990.
Title as it appears in the Sept. 1990 M.I.T. Graduate List: A lossy image compression based on nonuniform sampling and interpolation of the image intensity surface.
Includes bibliographical references (leaves 104-106).
by Charles Joseph Rosenberg.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
18

Antonelli, Michele. "New strategies for curve and arbitrary-topology surface constructions for design." Doctoral thesis, Università degli studi di Padova, 2015. http://hdl.handle.net/11577/3423911.

Full text
Abstract:
This dissertation presents some novel constructions for curves and surfaces with arbitrary topology in the context of geometric modeling. In particular, it deals mainly with three intimately connected topics that are of interest in both theoretical and applied research: subdivision surfaces, non-uniform local interpolation (in both univariate and bivariate cases), and spaces of generalized splines. Specifically, we describe a strategy for the integration of subdivision surfaces in computer-aided design systems and provide examples to show the effectiveness of its implementation. Moreover, we present a construction of locally supported, non-uniform, piecewise polynomial univariate interpolants of minimum degree with respect to other prescribed design parameters (such as support width, order of continuity and order of approximation). Still in the setting of non-uniform local interpolation, but in the case of surfaces, we devise a novel parameterization strategy that, together with a suitable patching technique, allows us to define composite surfaces that interpolate given arbitrary-topology meshes or curve networks and satisfy both requirements of regularity and aesthetic shape quality usually needed in the CAD modeling framework. Finally, in the context of generalized splines, we propose an approach for the construction of the optimal normalized totally positive (B-spline) basis, acknowledged as the best basis of representation for design purposes, as well as a numerical procedure for checking the existence of such a basis in a given generalized spline space. All the constructions presented here have been devised keeping in mind also the importance of application and implementation, and of the related requirements that numerical procedures must satisfy, in particular in the CAD context.
APA, Harvard, Vancouver, ISO, and other styles
19

You, Soyoung Ph D. Massachusetts Institute of Technology. "Finite element solution of interface and free surface three-dimensional fluid flow problems using flow-condition-based interpolation." Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/97845.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Mechanical Engineering, 2015.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 103-106).
The necessity for a highly accurate simulation scheme of free surface flows is emphasized in various industrial and scientific applications. To obtain an accurate response prediction, mass conservation must be satisfied. Due to a continuously moving fluid domain, however, it is a challenge to maintain the volume of the fluid while calculating the dynamic responses of free surfaces, especially when seeking solutions for long time durations. This thesis describes how the difficulty can be overcome by proper employment of an Arbitrary Lagrangian Eulerian (ALE) method derived from the Reynolds transport theorem to compute unsteady Newtonian flows including fluid interfaces and free surfaces. The proposed method conserves mass very accurately and obtains stable and accurate results with very large solution steps and even coarse meshes. The continuum mechanics equations are formulated, and the Navier-Stokes equations are solved using a 'flow-condition-based interpolation' (FCBI) scheme. The FCBI method uses exponential interpolations derived from the analytical solution of the 1-dimensional advection-diffusion equation. The thesis revisits the 2-dimensional FCBI method with special focus on the application to flow problems in highly nonlinear moving domains with interfaces and free surfaces, and develops an effective 3-D FCBI tetrahedral element for such applications. The newly developed 3-D FCBI solution scheme can solve flow problems of a wide range since it can handle highly nonlinear and unsteady flow conditions, even when large mesh distortions occur. Various example solutions are given to show the effectiveness of the developed solution schemes.
by Soyoung You.
Ph. D.
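The key ingredient named in the abstract, an interpolation function derived from the analytical solution of the 1D steady advection-diffusion equation, can be sketched in a few lines; the function name and tolerance here are invented:

```python
import numpy as np

def fcbi_weight(xi, pe):
    """Exponential interpolation weight from the analytical solution of the
    1D steady advection-diffusion equation. xi in [0, 1] is the local element
    coordinate; pe is the element Peclet number. Reduces to linear
    interpolation as pe -> 0 and approaches upwinding as |pe| grows."""
    if abs(pe) < 1e-10:
        return xi            # diffusion-dominated limit: linear interpolation
    return np.expm1(pe * xi) / np.expm1(pe)
```

Blending element interpolation toward the upwind side as advection strengthens is what makes flow-condition-based schemes stable on coarse meshes.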
APA, Harvard, Vancouver, ISO, and other styles
20

Kiichenko, Vladyslav Yuriiovych. "Use of surface interpolation methods for determination of dioxide hydroxide in the air of the city of Kyiv." Thesis, National Aviation University, 2021. https://er.nau.edu.ua/handle/NAU/50612.

Full text
Abstract:
In geographic information systems, interpolation of surfaces by various methods is often used. Topics in this area are relevant today and promising for further study and practical research in the field of geoinformation using GIS technologies. The purpose of interpolation in GIS is to fill in the gaps between known measurement points and thus simulate a continuous distribution of a property (attribute). Interpolation is based on the assumption that spatially distributed objects are correlated in space, that is, adjacent objects have similar characteristics. Spatial interpolation of point data is based on the choice of analytical surface model.
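The assumption stated above, that spatially adjacent objects have similar characteristics, is exactly what inverse-distance weighting (one of the ArcGIS methods the references point to) encodes: each unknown value is a distance-weighted average of the known samples. A minimal sketch, with invented names and a small epsilon to avoid division by zero:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance weighting: nearer samples get larger weights."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power            # weight decays with distance
    w /= w.sum(axis=1, keepdims=True)       # normalise weights per query point
    return w @ z_known
```

Kriging and spline interpolation, the other methods named, replace these purely geometric weights with weights derived from a fitted spatial-correlation model or a smoothness criterion.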
APA, Harvard, Vancouver, ISO, and other styles
21

Moussa, Hadjer. "Traitement automatique de données océanographiques pour l'interpolation de la ∫CO₂ de surface dans l'océan Atlantique tropical, en utilisant les données satellitaires." Thesis, Perpignan, 2016. http://www.theses.fr/2016PERP0025/document.

Full text
Abstract:
This thesis uses satellite data of SST (sea surface temperature), SSS (sea surface salinity), and Chl-a (chlorophyll-a) to interpolate the CO2 fugacity (fCO2) in the surface layer of the tropical Atlantic Ocean, for the seasons of the period 2002-2013. Three data types were used: in situ (SOCAT V.3 DB (database)); satellite (MODIS-A, SeaWiFS, and SMOS sensors); and assimilated (SODA V.2.2.4 DB). The first step was data classification based on SST. The second step was fCO2 interpolation (for each class of each season), using feedforward NNs (artificial neural networks) with backpropagation learning. The results obtained (RMSEs (root mean square errors) between 8.8 and 15.7 µatm) confirm the importance of processing each season separately, classifying the data, and choosing the best NN on the basis of the generalization results. This allowed the production of 138 monthly fCO2 CSV (comma-separated values) files, with 4 km x 4 km spatial resolution, for the period from July 2002 to December 2013.
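The second step above, a feedforward network trained by backpropagation to regress fCO2 from (SST, SSS, Chl-a), can be sketched roughly as follows; the synthetic data, network size, learning rate, and iteration count are illustrative assumptions, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for normalised (SST, SSS, Chl-a) inputs and fCO2 targets.
X = rng.uniform(-1, 1, size=(200, 3))
y = (0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2]).reshape(-1, 1)

# One hidden layer with tanh activation, trained by plain gradient descent.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, out0 = forward(X)
rmse_before = np.sqrt(np.mean((out0 - y) ** 2))

for _ in range(500):
    h, out = forward(X)
    err = out - y                        # gradient of 0.5*MSE w.r.t. output
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = err @ W2.T * (1 - h ** 2)       # backpropagate through tanh
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, out1 = forward(X)
rmse_after = np.sqrt(np.mean((out1 - y) ** 2))
```

After training, the RMSE on the training data drops well below its initial value, mirroring the thesis's use of RMSE to select the best network.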
APA, Harvard, Vancouver, ISO, and other styles
22

Qu, Ruibin. "Recursive subdivision algorithms for curve and surface design." Thesis, Brunel University, 1990. http://bura.brunel.ac.uk/handle/2438/5447.

Full text
Abstract:
In this thesis, the author studies recursive subdivision algorithms for curves and surfaces. Several subdivision algorithms are constructed and investigated, and some graphic examples are presented. Inspired by Chaikin's algorithm and the Catmull-Clark algorithm, two non-uniform schemes, a non-uniform corner-cutting scheme and a recursive subdivision algorithm for non-uniform B-spline curves, are constructed and analysed. An adapted parametrization is introduced to analyse these non-uniform algorithms. To solve the surface interpolation problem, the Dyn-Gregory-Levin 4-point interpolatory scheme is generalized to surfaces, and a 10-point interpolatory subdivision scheme for surfaces is formulated. The so-called Butterfly Scheme, first introduced by Dyn, Gregory and Levin in 1988, is a special case of this scheme. By studying the cross-differences of directional divided differences, a matrix approach for analysing uniform subdivision algorithms for surfaces is established, and the convergence of the 10-point scheme over both uniform and non-uniform triangular networks is studied. Another algorithm, a subdivision algorithm for uniform bi-quartic B-spline surfaces over arbitrary topology, is introduced and investigated. This algorithm generalizes the Doo-Sabin and Catmull-Clark algorithms and produces uniform bi-quartic B-spline patches over uniform data. By studying the local subdivision matrix, which is a circulant, the tangent plane and curvature properties of the limit surfaces at the so-called extraordinary points are studied in detail.
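Chaikin's uniform corner-cutting algorithm, the starting point for the non-uniform constructions above, replaces each edge of a control polygon with two new points at its 1/4 and 3/4 positions; a minimal open-polygon sketch (the control polygon is an illustrative example):

```python
import numpy as np

def chaikin_step(points):
    """One round of Chaikin corner cutting on an open control polygon.

    Each edge P_i P_{i+1} contributes the two points
    Q_i = 3/4 P_i + 1/4 P_{i+1} and R_i = 1/4 P_i + 3/4 P_{i+1}.
    """
    p = np.asarray(points, dtype=float)
    q = 0.75 * p[:-1] + 0.25 * p[1:]
    r = 0.25 * p[:-1] + 0.75 * p[1:]
    out = np.empty((2 * (len(p) - 1), p.shape[1]))
    out[0::2] = q   # interleave the two new points per edge
    out[1::2] = r
    return out

poly = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
refined = poly
for _ in range(4):
    refined = chaikin_step(refined)
# In the limit the polygon converges to a quadratic B-spline curve.
```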
APA, Harvard, Vancouver, ISO, and other styles
23

Lopez, Radcenco Manuel. "Data-driven approaches for ocean remote sensing : from the non-negative decomposition of operators to the reconstruction of satellite-derived sea surface dynamics." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2018. http://www.theses.fr/2018IMTA0107/document.

Full text
Abstract:
In the last few decades, the ever-growing availability of multi-source ocean remote sensing data has been a key factor in improving our understanding of upper ocean dynamics. In this regard, developing efficient approaches to exploit these datasets is of major importance. In particular, the decomposition of geophysical processes into relevant modes is a key issue for characterization, forecasting and reconstruction problems. Inspired by recent advances in blind source separation, we aim, in the first part of this thesis dissertation, to extend non-negative blind source separation models to the problem of the observation-based characterization and decomposition of linear operators or transfer functions between variables of interest. We develop mathematically sound and computationally efficient schemes. We illustrate the relevance of the proposed decomposition models in different applications involving the analysis and forecasting of geophysical dynamics. Subsequently, given that the ever-increasing availability of multi-source datasets supports the exploration of data-driven alternatives to classical model-driven formulations, we explore recently introduced data-driven models for the interpolation of geophysical fields from irregularly sampled satellite-derived observations. Importantly, with a view towards the future SWOT mission, the first satellite mission to produce complete two-dimensional wide-swath satellite altimetry observations, we focus on assessing the extent to which SWOT data may lead to an improved reconstruction of altimetry fields.
APA, Harvard, Vancouver, ISO, and other styles
24

Cognault, Aurore. "Caractérisation de SER Basse Fréquence et Modes Caractéristiques." Phd thesis, Ecole Centrale Paris, 2009. http://tel.archives-ouvertes.fr/tel-00453298.

Full text
Abstract:
RCS (radar cross-section) is the quantity that measures an object's reflecting power or, conversely, its electromagnetic stealth. Controlling the RCS, and if possible reducing it, is a major challenge in defence aeronautics; in particular, it is key to aircraft survivability. Historically, the RADAR frequencies of interest were those of the Super High Frequency band, corresponding to wavelengths of 2 to 30 centimetres. Suitable analysis tools and means of measuring or characterizing the RCS were developed for this band and have proven extremely effective; one example is the CAMELIA anechoic chamber at CESTA. At low frequencies, however, accurate measurements are harder to perform: for wavelengths of 1 to 5 metres, the thickness of the absorbers is often too small, and even the dimensions of anechoic chambers amount to only a few wavelengths. The objective of this thesis was to propose and study new algorithms for improving or simplifying low-frequency RCS characterization. The notion of characteristic currents, introduced by Harrington and Mautz in the 1970s and later taken up by Y. Morel for perfectly conducting objects, decomposes an arbitrary induced current into elementary currents. Characteristic modes are obtained by radiating these characteristic currents. However, no tool existed for determining the modes when the object is no longer perfectly conducting. We therefore built and validated such a tool. To do so, we first revisited the mathematical framework that defines the perturbation operator, its mathematical properties and its eigendecomposition, and showed that the discretized operator retains these mathematical properties.
We then validated our method for the direct computation of the characteristic modes, obtained by diagonalizing the discretized perturbation operator. Next, we carried out phenomenological studies. We first observed how the eigenelements of the perturbation operator evolve as a function of the impedance, paying particular attention to the case of impedance equal to 1. We then observed the phenomena as the frequency varies; by focusing on the eigenvalues, we were able to distinguish two types of modes. Finally, we detailed several concrete applications of this mode-determination method that improve or simplify low-frequency RCS characterization. The ORFE tool (a tool for the reformulation, filtering and extrapolation of data) attenuates the error terms inherent in any characterization and extrapolates existing data to configurations that were not acquired or are not accessible to measurement; it led to a patent. A low-frequency RCS interpolation tool was also built, which gives better results than linear interpolation of the RCS. We also implemented a low-frequency imaging method that locates possible metallization defects on the object under consideration, using the basis of characteristic currents. Finally, we presented an RCS characterization methodology that takes the limits of the measurement facilities into account, and we showed that this characterization provides absolute information on the object's RCS within a domain of validity. A patent was also filed on this method.
APA, Harvard, Vancouver, ISO, and other styles
25

Rizo, Steven R. "Quantifying the Effect of Topographic Slope on Lava Flow Thickness: A First Step to Improve Lava Flow Volume Estimation Methods." Scholar Commons, 2018. http://scholarcommons.usf.edu/etd/7222.

Full text
Abstract:
The volumes of lava flows provide important information on the magnitude of volcanic eruptions, and accurate volumes are necessary to produce reliable models of lava flow emplacement or to constrain the internal structure of volcanoes. The most accurate lava flow volumes are obtained when the topography both before and after an eruption is known, but information on the topography before emplacement is absent for non-historic lava flows, so the pre-emplacement topography must be reconstructed to calculate their volume. Common methods for this include inverse distance-weighted averaging and global polynomial interpolation, but these can still underestimate the volume of the flow, and the surface of the flow itself is not considered in these interpolations. A new calculation method therefore seems necessary to better constrain lava flow volumes; including the lava flow surface in the volume calculation, given that it is generally excluded during interpolation of the pre-emplacement topography, may improve volume calculation for flows whose base surface is unknown. The 2012-2013 Tolbachik lava flow is used to examine potential relationships, because elevation data are available from both before and after the eruption. A quantitative analysis of the relationships between the slope of the topography before and after lava flow emplacement, and between slope and lava flow thickness, is performed. In addition, the slope of the topography calculated over local and regional scales is used as a new interpolation method, and the thickness derived from the interpolated surface is compared to the known thickness of the flow.
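When pre- and post-emplacement DEMs are both available, as for the Tolbachik flow, the volume reduces to a cell-by-cell sum of positive elevation differences; a minimal sketch with made-up grids and cell size:

```python
import numpy as np

def flow_volume(pre_dem, post_dem, cell_area):
    """Lava flow volume from pre/post elevation grids of the same shape.

    Thickness is the positive part of the elevation difference;
    volume is thickness summed over cells times the cell footprint.
    """
    diff = np.asarray(post_dem, float) - np.asarray(pre_dem, float)
    thickness = np.clip(diff, 0.0, None)   # ignore cells with no deposit
    return thickness.sum() * cell_area

# Hypothetical 3x3 grids: flat pre-eruption surface, a small flow lobe after.
pre = np.zeros((3, 3))
post = np.array([[0.0, 1.0, 0.0],
                 [2.0, 3.0, 2.0],
                 [0.0, 1.0, 0.0]])
vol = flow_volume(pre, post, cell_area=25.0)  # e.g. 5 m x 5 m cells
```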
APA, Harvard, Vancouver, ISO, and other styles
26

Collier, Nathaniel O. "The Quasi-Uniformity Condition and Three-Dimensional Geometry Representation as it Applies to the Reproducing Kernel Element Method." Scholar Commons, 2009. https://scholarcommons.usf.edu/etd/1904.

Full text
Abstract:
The Reproducing Kernel Element Method (RKEM) is a hybrid between finite elements and meshfree methods that provides shape functions of arbitrary order and continuity yet retains the Kronecker-delta property. To achieve these properties, the underlying mesh must meet certain regularity constraints, unique to RKEM. The aim of this dissertation is to develop a precise definition of these constraints, and a general algorithm for assessing a mesh is developed. This check is a critical step in the use of RKEM in any application. The general checking algorithm is made more specific to apply to two-dimensional triangular meshes with circular supports and to three-dimensional tetrahedral meshes with spherical supports. The checking algorithm features the output of the uncovered regions that are used to develop a mesh-mending technique for fixing offending meshes. The specific check is used in conjunction with standard quality meshing techniques to produce meshes suitable for use with RKEM. The RKEM quasi-uniformity definitions enable the use of RKEM in solving Galerkin weak forms as well as in general interpolation applications, such as the representation of geometries. A procedure for determining a RKEM representation of discrete point sets is presented with results for surfaces in three-dimensions. This capability is important to the analysis of geometries such as patient-specific organs or other biological objects.
APA, Harvard, Vancouver, ISO, and other styles
27

Taleb, Riadh. "Design géométrique de surfaces de topologie arbitraire." Phd thesis, Université Joseph Fourier (Grenoble), 2001. http://tel.archives-ouvertes.fr/tel-00004706.

Full text
Abstract:
This thesis is devoted to defining a geometrically smooth surface interpolating a triangulated set of points in R^3. Such a triangulation, which we call a "surface network", must define a 2-dimensional submanifold and can represent surfaces of any topological genus. It provides the topological information through a data structure containing the adjacency relations between vertices, edges and faces. We have developed two methods for interpolating the vertices of the surface network. They are strictly local and produce piecewise polynomial surfaces of degree 5 with G^1 continuity. Many free parameters are available and are adjusted either interactively or automatically in order to smooth the surface. In the interactive setting, several design tools are developed, based on the geometric interpretation of the free parameters; thanks to the locality of the algorithms, the desired shape can be obtained through real-time modelling. For automatic design, numerous algorithms were developed that satisfy a number of shape characteristics. A large number of heuristic rules and local optimizations are used to set the values of the shape parameters, so as to obtain satisfactory shapes as well as optimal control of the surface.
APA, Harvard, Vancouver, ISO, and other styles
28

Huang, Shuai. "Efficient and Accurate Path Integral Simulations of Thermochemical Properties." Thesis, The University of Sydney, 2022. https://hdl.handle.net/2123/28051.

Full text
Abstract:
The accurate calculation of quantum thermochemical properties is a challenge for contemporary computational chemistry. Path integral Monte Carlo (PIMC) simulations provide an efficient and scalable method to estimate full-dimensional properties in large chemical systems. In this thesis, we investigate different implementations of PIMC simulations and combine them with the modified Shepard interpolation (MSI) of molecular potential energy surfaces (PESs). We demonstrate these methods can be made accurate and efficient for bound state molecular systems. Within the path integral formalism, we consider the primitive approximation of the action and three higher order approximations: Takahashi-Imada, Suzuki-Chin and Chin. The resulting PIMC methods are benchmarked using spectroscopically accurate PESs for water and HCN–HNC and parameters within the Suzuki-Chin and Chin approximations are numerically optimised. The Suzuki-Chin approximation is shown to provide the most computationally efficient PIMC simulations. The accuracy of the PIMC method relies on the quality of molecular PES. A review of the GROW procedure for molecular PESs, and its underlying MSI, is undertaken. Our PIMC code is incorporated within GROW software suite, allowing new PES configurations to be chosen from path integral sampling of configuration space at a given temperature. This allows PESs to be automatically and iteratively improved from a small set of initial molecular configurations. We apply the GROW procedure to methane, as a prototypical bound state system, using the numerically optimised Suzuki-Chin action, to determine thermochemical properties. These calculations allow efficient strategies for choosing initial configurations for bound state problems to be developed and the dependence of quantum thermochemical properties on electronic structure method and basis set to be determined.
APA, Harvard, Vancouver, ISO, and other styles
29

Oqielat, Moa'ath Nasser. "Modelling water droplet movement on a leaf surface." Thesis, Queensland University of Technology, 2009. https://eprints.qut.edu.au/30232/1/Moa%27ath_Oqielat_Thesis.pdf.

Full text
Abstract:
The central aim for the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and to compare the model behavior with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially a laser scanner is used to capture the surface characteristics for two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. 
Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
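The RBF half of the hybrid CT-RBF fit described above can be illustrated with a classic global RBF interpolant; the Gaussian kernel, its width, and the sample heights below are illustrative assumptions (the thesis combines RBFs with Clough-Tocher elements on a triangulation rather than using a global RBF alone):

```python
import numpy as np

def rbf_fit(xy, z, epsilon=1.0):
    """Solve for Gaussian RBF weights so the surface passes through all samples."""
    xy = np.asarray(xy, float)
    z = np.asarray(z, float)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.exp(-(epsilon * d) ** 2)      # symmetric positive definite kernel matrix
    return np.linalg.solve(A, z)

def rbf_eval(xy, weights, xy_query, epsilon=1.0):
    """Evaluate the fitted RBF surface at query points."""
    xy = np.asarray(xy, float)
    q = np.atleast_2d(np.asarray(xy_query, float))
    d = np.linalg.norm(q[:, None, :] - xy[None, :, :], axis=2)
    return np.exp(-(epsilon * d) ** 2) @ weights

# Hypothetical scattered leaf-height samples.
pts = [(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.5)]
heights = [0.0, 0.1, 0.1, 0.0, 0.3]
w = rbf_fit(pts, heights)
z0 = rbf_eval(pts, w, [(0.5, 0.5)])  # exact interpolant reproduces the samples
```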
APA, Harvard, Vancouver, ISO, and other styles
30

Oqielat, Moa'ath Nasser. "Modelling water droplet movement on a leaf surface." Queensland University of Technology, 2009. http://eprints.qut.edu.au/30232/.

Full text
Abstract:
The central aim for the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and to compare the model behavior with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially a laser scanner is used to capture the surface characteristics for two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. 
Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
APA, Harvard, Vancouver, ISO, and other styles
31

Aït, Ettajer Taoufik. "Modélisation de surfaces géologiques complexes sous contraintes géométriques : application à la génération automatique de modèles géologiques." Vandoeuvre-les-Nancy, INPL, 1995. http://www.theses.fr/1995INPL058N.

Full text
Abstract:
The geometric modelling of geological surfaces is a rapidly growing field whose objective is to represent complex geological surfaces accurately. Since approaches based on classical CAD methods are not well suited to modelling such surfaces, Professor Mallet proposed a new approach to modelling natural surfaces within the GOCAD project. The core of the project is a new three-dimensional interpolation principle called DSI (Discrete Smooth Interpolation). The work presented in this dissertation was carried out within this project and consisted of developing a set of geometric tools for the automatic construction of geological models. These tools include the computation of isovalue curves on complex surfaces and the modelling of faults. The new approach to computing isovalue curves on surfaces stands out for its speed and accuracy, even on complex surfaces (for example, surfaces with holes). Modelling a fault involves constructing a surface and characterizing its relation to a horizon; the latter operation was made possible by introducing three DSI geometric constraints: OnTsurf, VecLink and OnBorder. The geometric modelling tools of the GOCAD project will be used to build a prototype automatic generator of geological models.
APA, Harvard, Vancouver, ISO, and other styles
32

GILARDEAU, HELENE. "Modelisation surfacique d'objets tridimensionnels : application a la vision par ordinateur." Toulouse 3, 1989. http://www.theses.fr/1989TOU30017.

Full text
Abstract:
Using an original 3D vision system, which provides a set c::(c) of connected components made up of points known through their 3D coordinates, corresponding to the projection of a (24 x 36) grid onto the scene, we propose a model of a 3D scene as a set of regular surfaces.
APA, Harvard, Vancouver, ISO, and other styles
33

Huang, Conglin. "3D RECONSTRUCTION USING MULTI-VIEW IMAGING SYSTEM." UKnowledge, 2009. http://uknowledge.uky.edu/gradschool_theses/600.

Full text
Abstract:
This thesis presents a new system that reconstructs the 3D representation of dental casts. To maintain the integrity of the 3D representation, a standard model is built to cover the blind spots that the camera cannot reach. The standard model is obtained by scanning a real human mouth model with a laser scanner. Then the model is simplified by an algorithm which is based on iterative contraction of vertex pairs. The simplified standard model uses a local parametrization method to obtain the curvature information. The system uses a digital camera and a square tube mirror in front of the camera to capture multi-view images. The mirror is made of stainless steel in order to avoid double reflections. The reflected areas of the image are considered as images taken by the virtual cameras. Only one camera calibration is needed since the virtual cameras have the same intrinsic parameters as the real camera. Depth is computed by a simple and accurate geometry based method once the corresponding points are identified. Correspondences are selected using a feature point based stereo matching process, including fast normalized cross-correlation and simulated annealing.
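Normalized cross-correlation, used above to score candidate correspondences in the stereo matching step, compares zero-mean, unit-norm patches and is invariant to affine intensity changes; a small sketch on synthetic patches:

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation of two equal-size image patches.

    Returns a score in [-1, 1]: 1 for patches related by a positive
    affine intensity change, -1 for inverted contrast.
    """
    a = np.asarray(patch_a, float).ravel()
    b = np.asarray(patch_b, float).ravel()
    a = a - a.mean()               # remove local brightness
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom)

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = 2.0 * a + 5.0   # brighter, higher-contrast version of the same patch
c = -a              # contrast-inverted patch
```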
APA, Harvard, Vancouver, ISO, and other styles
34

Öhman, Adam. "The Calibrated SSVI Method - Implied Volatility Surface Construction." Thesis, KTH, Matematisk statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-257501.

Full text
Abstract:
This thesis investigates how to construct implied volatility surfaces in a robust and arbitrage-free way. To determine whether the solutions are arbitrage free, an initial investigation of arbitrage in volatility surfaces was carried out. This investigation yielded two comprehensive theorems, due to Roper in \cite{Roper2010}, from which two applicable arbitrage tests were constructed. These tests became very important tools in the remainder of the thesis. The most promising classes of models for modelling the implied volatility surface were then investigated. It was concluded that the classes with the best potential were the stochastic volatility models and the parametric representation models, and that the choice between them comes down to a trade-off between simplicity and quality of the result. If the results of the parametric representation models could be improved, that class would be the best applicable choice, so the remainder of the thesis investigates it. The parametric representation model chosen for investigation was the SVI parametrization family, since it seemed to have the most potential on top of its already strong foundation. The SVI parametrization family comprises three parametrizations: the raw SVI parametrization, the SSVI parametrization, and the eSSVI parametrization. It was concluded that the raw SVI parametrization, even though it fits the market very well, is not robust enough to be chosen: in most cases it generates arbitrage in its surfaces. The SSVI model was concluded to be a very strong model compared to the raw SVI, since it generates completely arbitrage-free solutions with good enough results. The eSSVI is an extended parametrization of the SSVI intended to improve its short-maturity results.
It was concluded to give small improvements, at the cost of a harder optimization procedure, so the SSVI parametrization may be the better choice in applications. To try to improve the results of the SSVI parametrization, a complementary procedure was developed, named the calibrated SSVI method. Unlike the eSSVI parametrization, this method does not change the parametrization but instead focuses on calibrating the initial fit that the SSVI generates. The method greatly improves the initial fit of the SSVI surface but is less robust, since it generates harder cases for the interpolation and extrapolation.
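For reference, the raw SVI parametrization discussed above writes the total implied variance at log-moneyness k as w(k) = a + b*(rho*(k - m) + sqrt((k - m)^2 + sigma^2)); a minimal evaluation with illustrative (not market-calibrated) parameters:

```python
import math

def svi_total_variance(k, a, b, rho, m, sigma):
    """Raw SVI total implied variance at log-moneyness k."""
    return a + b * (rho * (k - m) + math.sqrt((k - m) ** 2 + sigma ** 2))

# Illustrative parameter values; real values come from a calibration step.
params = dict(a=0.04, b=0.1, rho=-0.3, m=0.0, sigma=0.2)
w_atm = svi_total_variance(0.0, **params)   # at k = m this equals a + b*sigma
```

Checking whether such a slice, combined across maturities, stays free of butterfly and calendar arbitrage is exactly what the thesis's arbitrage tests are for.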
APA, Harvard, Vancouver, ISO, and other styles
35

Zhao, Chang Sheng. "Reconstruction de surfaces tridimensionnelles en vision par ordinateur." Grenoble INPG, 1993. http://tel.archives-ouvertes.fr/tel-00343706.

Full text
Abstract:
The general context of this work is the study of the representation and reconstruction of three-dimensional surfaces of non-polyhedral objects observed with passive vision systems. This thesis is devoted more specifically to the reconstruction of three-dimensional surfaces using B-spline surfaces, from the observation of the motion of occluding contours in a sequence of calibrated images. We present several results. First, we introduce a pyramidal algorithm for tracking curved contours and correcting contour anomalies in an image sequence. Second, an original method for reconstructing three-dimensional B-spline surfaces is developed from a sequence of calibrated images. Finally, we propose an approach for joining the different reconstructed B-spline surfaces using triangular meshing and surface interpolation. Results on synthetic and real data are presented. The potential applications of this work are numerous in the context of modelling non-polyhedral objects from a sequence of calibrated images
APA, Harvard, Vancouver, ISO, and other styles
36

Choubanne, Imad. "Régularité des solutions des équations stationnaires de Boussinesq en présence de thermocapillarité à la surface du liquide et Méthodes d'éléments finis mixtes." Valenciennes, 2003. http://ged.univ-valenciennes.fr/nuxeo/site/esupversions/d209b2fe-1ce3-4576-89a5-fd8394892079.

Full text
Abstract:
In this work, we study the existence and regularity of solutions of the stationary Boussinesq equations of thermoconvection (the Navier-Stokes equations coupled with the heat equation) with thermocapillarity at the surface of the liquid in polygons and prisms, as well as classical and refined finite element methods for computing approximations of the solutions. The study of the linearized Boussinesq problem and interpolation theory make it possible to establish the regularity of these solutions in weighted spaces that account for their singular behaviour near the vertices and edges. As a consequence, the use of a classical finite element method does not in general achieve an optimal order of convergence. We show that appropriate mesh refinements near the vertices in 2-D and near the edges in 3-D restore the optimal rate of convergence
In this work, we study the existence and regularity properties of the stationary Boussinesq equations of thermoconvection (Navier-Stokes equations coupled with the heat equation) with a thermocapillarity effect on the surface of the liquid in polygonal and prismatic domains, as well as classical and refined finite element methods to compute approximations of the solutions. The regularity problem for the solutions of the nonlinear problem is reduced to the regularity of the solution of the linearized Boussinesq problem by putting the nonlinear terms in the right-hand sides and using interpolation theory. It is proved that the solution belongs to suitable weighted spaces which take into account its singular behaviour near the vertices and edges. As a consequence of this singular behaviour, the use of the classical finite element method does not in general lead to an optimal order of convergence. It is shown that appropriate mesh refinements near the vertices in 2-D, respectively near the edges in 3-D, allow us to restore the optimal rate of convergence
APA, Harvard, Vancouver, ISO, and other styles
37

Yvart, Alex. "Modélisation Hiérarchique de Surfaces à partir de Maillages Polyédriques et Applications." Phd thesis, Grenoble INPG, 2004. http://tel.archives-ouvertes.fr/tel-00009003.

Full text
Abstract:
Computer-aided design and manufacturing (CAD/CAM) is essential in the creation and development of a product. Likewise, computer graphics is commonly used in special effects and virtual reality. Yet each field uses its own tools to model surfaces. To meet the constraints of both types of use, we propose a new surface model that is polynomial, interpolating, globally smooth (with continuous tangent planes) and hierarchical. An initial smooth surface is first constructed over a triangular mesh, which makes it possible to handle any topology. Details are then progressively added level by level. To do so, each face is subdivided into four sub-faces by inserting a vertex at the middle of each edge. This refinement does not induce significant changes on the surface. Each detail is defined locally with respect to the previous one through the use of local frames. This hierarchy allows the set of fine details to follow the deformation continuously when the surface is edited. Two applications are then proposed in this work. First, a 3D modeller that respects the creative workflow of artists: it is based on computing a global shape that is progressively refined to obtain the desired object, and offers the possibility of editing objects intuitively by modifying very few parameters. Finally, a reconstruction tool is presented for modelling existing objects with our new hierarchical representation.
APA, Harvard, Vancouver, ISO, and other styles
38

Flötotto, Julia. "Un système de coordonnées associé à un échantillon de points d'une variété: définition, propriétés et applications." Phd thesis, Université de Nice Sophia-Antipolis, 2003. http://tel.archives-ouvertes.fr/tel-00832487.

Full text
Abstract:
In many application domains, a manifold embedded in Euclidean space is often represented by a sample of points. In this thesis, we define a coordinate system associated with such a sample on the manifold, generalizing the natural coordinates defined by Sibson. We exhibit its fundamental mathematical properties as well as its application to the interpolation of a function defined on the manifold. We introduce the notion of a Voronoi atlas, defined as a set of cells approximating the Voronoi diagram restricted to the manifold, and show its application to surface reconstruction and remeshing. Finally, we extend the properties of natural coordinates to power diagrams and propose a synthesis of natural-coordinate interpolation methods, detailing proofs omitted in the original papers.
APA, Harvard, Vancouver, ISO, and other styles
39

Cantarello, Luca. "Use of a Kalman filtering technique for near-surface temperature analysis." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/13455/.

Full text
Abstract:
A statistical post-processing of the hourly 2-meter temperature fields from the Nordic convective-scale operational Numerical Weather Prediction model Arome MetCoOp 2.5 Km has been developed and tested at the Norwegian Meteorological Institute (MET Norway). The objective of the work is to improve the representation of the temperature close to the surface, combining model data and in-situ observations for climatological and hydrological applications. In particular, a statistical scheme based on a bias-aware Local Ensemble Transform Kalman Filter has been adapted to the spatial interpolation of surface temperature. This scheme starts from an ensemble of 2-meter temperature fields derived from Arome MetCoOp 2.5 Km and, taking into account the observations provided by the MET Norway network, produces an ensemble of analysis fields characterised by a grid spacing of 1 km. The model best estimate employed in the interpolation procedure is given by the latest available forecast, subsequently corrected for the model bias. The scheme has been applied off-line and the final analysis is performed independently at each grid point. The final analysis ensemble has been evaluated, and its mean value significantly improves the best estimate of Arome MetCoOp 2.5 km in representing the 2-meter temperature fields, in terms of both accuracy and precision, with a reduction in the root mean squared error as well as in the bias and an improvement in reproducing the cold extremes during wintertime. More generally, the analysis ensemble displays better forecast verification scores, with an overall reduction in the Brier Score and its reliability component and an increase in the resolution term for the zero-degrees threshold. However, the final ensemble spread remains too narrow, though not as narrow as the model output.
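The analysis step at the heart of such a scheme can be illustrated with a scalar Kalman update. This is a minimal sketch of the principle only; the thesis uses a bias-aware Local Ensemble Transform Kalman Filter, which this does not reproduce:

```python
def kalman_analysis(x_b, p_b, y, r):
    """Scalar Kalman analysis step: combine a background estimate x_b
    (error variance p_b) with an observation y (error variance r)."""
    k = p_b / (p_b + r)        # Kalman gain: weight given to the observation
    x_a = x_b + k * (y - x_b)  # analysis mean
    p_a = (1.0 - k) * p_b      # analysis variance, always reduced
    return x_a, p_a

# Background temperature of -2.0 degC (variance 4) and an observation
# of -1.0 degC (variance 1): the gain is 0.8, pulling the analysis to -1.2.
x_a, p_a = kalman_analysis(-2.0, 4.0, -1.0, 1.0)
```

The ensemble variant applies the same update with the background variance estimated from the spread of the forecast ensemble at each grid point.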
APA, Harvard, Vancouver, ISO, and other styles
40

Maquart, Tristan. "Trivariate models generation from unstructured surface manifolds for isogeometric analysis : Application to reduced order modeling with geometric parameters." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSEI033.

Full text
Abstract:
This work presents a generic framework for constructing volumetric isogeometric meshes from a complex geometry with arbitrary topology, for reduced order model applications. Indeed, structured meshes such as isogeometric or hexahedral meshes are difficult to obtain automatically. Statistical shape analysis and reduced order models require structured and ordered data to be built efficiently. To this end, we use the boundary of the triangulated solid model, the B-Rep CAD (Boundary Representation in Computer Aided Design). First, this thesis includes the integration of a pants-to-cuboids decomposition algorithm that takes geometric features into account. The cuboid decomposition splits a surface into a set of quadrilateral patches that can help define an associated volume. Cross fields, i.e., 4-symmetry direction fields, are used to guide an aligned global parameterization of the surface. This parameterization is optimized to minimize element distortion. The optimization process is designed to build cross fields with topological and geometrical constraints. Using the optimized cuboid decomposition, a volumetric structure is extracted. Based on the global parameterization and the previously computed volumetric structure, a trivariate isogeometric parameterization is deduced. Invariant topological properties are analyzed throughout the proposed process. Finally, for different geometrical instances with the same topology but different geometries, our method yields the same representation: trivariate isogeometric isotopological meshes holding the same connectivity. 
The efficiency and robustness of the proposed approach are illustrated through several examples of reduced order models using IGA (IsoGeometric Analysis)
This work presents a generic framework to construct trivariate isogeometric meshes of complicated geometry and arbitrary topology, as required for reduced order model applications. Indeed, structured meshes such as isogeometric or purely hexahedral ones are difficult to obtain in an automatic manner. Statistical shape analysis and reduced order modeling require structured and ordered data to be efficient. For that purpose, we use the triangulated boundary of the solid 3D model provided by B-Rep CAD (Boundary Representation in Computer Aided Design) models. First, the workflow includes the integration of a geometry-feature-aware pants-to-cuboids decomposition algorithm. The input triangulated mesh is decomposed into a set of cuboids in two steps: pants decomposition and cuboid decomposition. Cuboid decomposition splits a surface into a set of quadrilateral patches which can define a volumetric layout of the associated boundary surface. Cross fields, i.e., 4-symmetry direction fields, are used to guide a surface-aligned global parameterization. Optimizing this parameterization, patches of the quadrilateral layout inherited from the cuboid decomposition are re-positioned on the surface so as to achieve low overall distortion. The optimization process is designed to build cross fields with topological and geometrical constraints. Using the optimized cuboid decomposition, a volumetric layout is extracted. Based on the global parameterization and the structured volumetric layout previously computed, a trivariate isogeometric parameterization is deduced. Drawing on generalized forms of theorems from topology, invariant topological properties are analyzed throughout the proposed process. Finally, for different geometrical instances with the same topology but different geometries, our method yields the same representation: trivariate isogeometric isotopological meshes holding the same connectivity. 
The efficiency and the robustness of the proposed approach are illustrated through several examples of reduced order models using IGA (IsoGeometric Analysis)
APA, Harvard, Vancouver, ISO, and other styles
41

Lockyer, Peter Stephen. "Controlling the interpolation of NURBS curves and surfaces." Thesis, University of Birmingham, 2007. http://etheses.bham.ac.uk//id/eprint/6502/.

Full text
Abstract:
The primary focus of this thesis is to determine the best methods for controlling the interpolation of NURBS curves and surfaces. The various factors that affect the quality of the interpolant are described, and existing methods for controlling them are reviewed. Improved methods are presented for calculating the parameter values, derivative magnitudes, data point spacing and twist vectors, with the aim of producing high quality interpolants with minimal data requirements. A new technique for obtaining the parameter values and derivative magnitudes is evaluated, which constructs a C\(^1\) cubic spline with orthogonal first and second derivatives at specified parametric locations. When this data is used to create a C\(^2\) spline, the resulting interpolant is superior to those constructed using existing parameterisation and derivative magnitude estimation methods. Consideration is given to the spacing of data points, which has a significant impact on the quality of the interpolant. Existing methods are shown to produce poor results with curves that are not circles. Three new methods are proposed that significantly reduce the positional error between the interpolant and original geometry. For constrained surface interpolation, twist vectors must be estimated. A method is proposed that builds on the Adini method, and is shown to have improved error characteristics. In numerical tests, the new method consistently outperforms Adini. Interpolated surfaces are often required to join together smoothly along their boundaries. The constraints for joining surfaces with parametric and geometric continuity are discussed, and the problem of joining \(N\) patches to form an \(N\)-sided region is considered. It is shown that regions with odd \(N\) can be joined with G\(^1\) continuity, but those with even \(N\) or requiring G\(^2\) continuity can only be obtained for specific geometries.
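The parameterisation question the abstract raises can be made concrete with the standard distance-based baseline formulas (chord-length and centripetal), against which new techniques are typically evaluated. This sketch is illustrative, not the method proposed in the thesis:

```python
import math

def chord_length_params(points, alpha=1.0):
    """Parameter values for curve interpolation, normalized to [0, 1].
    alpha = 1 gives the chord-length method, alpha = 0.5 the centripetal
    method; both space parameters according to data point distances."""
    t = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        t.append(t[-1] + math.hypot(x1 - x0, y1 - y0) ** alpha)
    total = t[-1]
    return [ti / total for ti in t]

# Four collinear points with chord lengths 1, 2, 3:
t = chord_length_params([(0, 0), (1, 0), (3, 0), (6, 0)])
# t = [0, 1/6, 1/2, 1]
```

Uniform spacing (equidistant parameters) tends to produce overshoot on unevenly spaced data, which is why distance-based parameterisations are the usual starting point for interpolation schemes.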
APA, Harvard, Vancouver, ISO, and other styles
42

Nigro, Abdelmalek. "Algorithmes progressifs stables pour l'approximation de courbes et surfaces." Phd thesis, Université Joseph Fourier (Grenoble), 1995. http://tel.archives-ouvertes.fr/tel-00346056.

Full text
Abstract:
In this study, we treat a problem of interpolating data in the plane or in space. This problem differs from other existing problems in that the points are obtained progressively, i.e., at a given instant only the data up to that instant are known. The methods used are based on recurrent splines: each piece of the spline at step i (corresponding to the i-th datum) is computed from the previous pieces through a parametric or geometric join. The resulting algorithm is governed by a recurrence relation whose numerical stability we study. The data were interpolated in two different ways: (i) by a vector-valued spline function: in this case, stability is proved by means of the shape parameters arising from the geometric join between the sections of the spline; we thus give a new role to the shape parameters, namely absorbing the oscillations produced by the iterative computation; (ii) by a scalar representation of the data: each piece of the spline belongs to a space of dimension n, spanned by a family of functions with certain properties; stability is shown to be obtained by the very choice of the basis functions of the space
APA, Harvard, Vancouver, ISO, and other styles
43

EL, FAKER ABDELLATIF. "Interpolation et approximation des surfaces par des box-splines." Rennes 1, 1991. http://www.theses.fr/1991REN10021.

Full text
Abstract:
Approximation by bivariate box-splines is used to represent shapes in domains such as image synthesis or computer-aided manufacturing. It is also very practical in other fields where high continuity of the approximant is required. The spline approximation space is spanned by a box-spline and its translates, defined on a bounded polygonal domain compatible with a regular three-direction (type-1) mesh. Once the dimension of the spline space has been computed, we study the unisolvence problem for Lagrange and Hermite interpolation schemes with box-splines. A strategy and an algorithm for choosing the interpolation points, using graph theory and the notion of flow, are presented. The interpolation error is computed for box-splines of maximal continuity, and a maximal approximation order is obtained
APA, Harvard, Vancouver, ISO, and other styles
44

Ferguson, Neil Morris. "Continuous interpolations from crystalline to dynamically triangulated random surfaces." Thesis, University of Oxford, 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.239308.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Samozino, Marie. "Voronoi Centred Radial Basis Functions." Phd thesis, Université de Nice Sophia-Antipolis, 2007. http://tel.archives-ouvertes.fr/tel-00336379.

Full text
Abstract:
This thesis addresses the problem of reconstructing surfaces from point clouds. Recent advances in the acquisition of 3D shapes using scanners give rise to new needs in terms of reconstruction algorithms: one must be able to handle large noisy point clouds while giving a compact representation of the reconstructed surface.
The surface is reconstructed as the zero level set of a function. Representing a surface implicitly using Radial Basis Functions (RBF) has become a standard approach over the last ten years. An interesting problem is reducing the number of basis functions to obtain as compact a representation as possible and to reduce evaluation times.
Reducing the number of basis functions amounts to reducing the number of points (centres) on which they are centred. Our goal is to select a "small" set of centres that is as relevant as possible. To reduce the number of centres while keeping a maximum of information, we drop the correspondence between function centres and data points, which is imposed in almost all RBF approaches. Instead, we place the centres on the medial axis of the set of data points and show that this choice is appropriate.
To do so, we use the tools provided by computational geometry and approximate the medial axis by a subset of the vertices of the Voronoi diagram of the data points. We also propose two different approaches that sample the medial axis appropriately, in order to adapt the level of detail of the reconstructed surface to the budget of centres allotted by the user.
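The compact-RBF idea described in this abstract can be sketched with a generic least-squares fit in which the centres are decoupled from the data points. Gaussian kernels are used here for simplicity; the thesis places the centres on Voronoi-based approximations of the medial axis, which this sketch does not reproduce:

```python
import numpy as np

def rbf_matrix(pts, centers, s=1.0):
    """Gaussian kernel matrix: phi(|x - c|) = exp(-|x - c|^2 / (2 s^2))."""
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * s * s))

def fit_rbf(points, values, centers, s=1.0):
    """Least-squares weights w so that f(x) = sum_j w_j phi(|x - c_j|)
    approximates the samples; using fewer centres than data points
    gives the compact representation the abstract aims for."""
    A = rbf_matrix(points, centers, s)
    w, *_ = np.linalg.lstsq(A, values, rcond=None)
    return w

def eval_rbf(x, centers, w, s=1.0):
    return rbf_matrix(np.atleast_2d(np.asarray(x, float)), centers, s) @ w

# Sanity check: with centres equal to the data points, the fit interpolates.
pts = np.array([[i, j] for i in (0.0, 0.5, 1.0) for j in (0.0, 0.5, 1.0)])
vals = pts[:, 0] + pts[:, 1]
w = fit_rbf(pts, vals, pts, s=0.5)
```

In a reconstruction setting, the values would be signed distances (zero on the surface, nonzero at off-surface constraints), and the surface is extracted as the zero level set of the fitted function.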
APA, Harvard, Vancouver, ISO, and other styles
46

Almhdie, Imjabber Ahmad. "Comparaison locale des surfaces 3D : application à l’imagerie médicale." Orléans, 2007. http://www.theses.fr/2007ORLE2044.

Full text
Abstract:
The work presented in this thesis concerns the local comparison of medical surfaces. It was carried out to precisely compare two sequences of surfaces of the left ventricle of the heart reconstructed from two medical imaging modalities. The reconstructions from the first modality (nuclear medicine) served as a reference to validate new reconstructions obtained with the second (ultrasound). The theoretical contribution of the work covers two areas of signal and image processing: rigid registration and the reconstruction of closed surfaces. The work on surface registration is limited to rigid registration, in particular the Iterative Closest Point (ICP) algorithm, since the aim is to highlight the differences between distinct reconstructions of identical surfaces. A variant of the ICP algorithm is proposed to improve the matching of pairs of corresponding points in the two data sets to be registered. The vector search for minimum distances between pairs of points is replaced by a matrix search, which ensures the injectivity properties needed to estimate the rotation parameters. The performance of the new method is studied in the presence of different types of noise superimposed on the data. The reconstruction of incomplete 3D surfaces is used to compare and visualize data with irregular sampling or different resolutions. A new discrete form of the thin-plate model of the variational spline fitting (VSF) reconstruction method improves the performance of this method and takes into account periodicity conditions on the reconstructed surfaces. The results obtained are compared with those given by a Fourier frequency analysis. This work was complemented by the implementation of a new local method based on quasi-interpolating splines. 
The proposed methods were used to locally compare various medical surfaces: sequences of surfaces of the left ventricle of the heart, reconstructions of the lung envelope against a reference atlas, and identical pressure-ulcer surfaces reconstructed at several resolutions by different techniques.
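The rigid-registration step that ICP iterates is the closed-form alignment of matched point pairs; a standard SVD-based (Kabsch) sketch is shown below. The thesis's contribution concerns the matrix-based matching of the pairs, which this sketch does not reproduce:

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Rotation R and translation t minimizing sum |R p_i + t - q_i|^2
    over matched pairs (p_i, q_i): the closed-form SVD (Kabsch) step
    iterated inside ICP."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

A full ICP loop alternates this step with re-matching each point of P to its closest (or, in the thesis's variant, injectively assigned) counterpart in Q until convergence.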
APA, Harvard, Vancouver, ISO, and other styles
47

Costa, Gustavo Guilherme dos Santos. "Contribuição para os usuários de sistemas CAD/CAM/CNC em operação de fresamento de topo em aço para moldes e matrizes." Universidade Federal de Uberlândia, 2011. https://repositorio.ufu.br/handle/123456789/14885.

Full text
Abstract:
The increase in demand for plastic products, the need to reduce manufacturing times, and man's growing dependence on the computer nowadays, especially in manufacturing activities, have driven constant technological developments to meet these needs. In the dies and molds manufacturing industry for plastic injection, which requires machining operations such as milling, drilling and polishing, among others, dependence on computer systems such as CAD/CAM is ever greater. This technology assists in the manufacturing steps and provides speed and high accuracy in the manufacturing of complex geometries. Therefore, understanding and efficiently using these manufacturing aids are of enormous importance for the optimization of a production process. In this context, this work presents a study on the use of CAD/CAM programming resources in the milling of cavities in VP50 steel with ball-nose cemented carbide inserts. The influence of two types of interpolation (linear and circular) and two tolerance bands (0.05 mm and 0.1 mm) defining the tool path was investigated in the machining of a cavity shaped like a mold for a cell phone battery cover. The output variables evaluated were the machining time, the number of lines in the program, the surface roughness parameters (Ra, Rq, Rz) of the parts, the radius of curvature, the form deviation of an arbitrary line profile and the tool wear. The results showed that, from the statistical point of view (ANOVA), none of the interpolation and tolerance conditions employed significantly influenced the surface roughness values, the form deviation of an arbitrary line, or the tool wear. Linear interpolation with a tolerance of 0.1 mm was the most viable for the production of this cavity under the conditions investigated, owing to the good finish produced, the low tool wear and the shorter machining time.
The growing demand for plastic products, the need to reduce manufacturing times, and man's increasing dependence on the computer nowadays, especially in manufacturing activities, have driven the search for constant technological developments to meet these needs. In the die manufacturing industry, as well as in molds for plastic injection (which require machining operations such as milling, drilling and polishing, among others), dependence on computer systems such as CAD/CAM is ever greater. This technology assists in the manufacturing steps and offers speed and high accuracy in the manufacture of complex geometries. Therefore, understanding and knowing how to use these manufacturing aids efficiently is of enormous importance for the optimization of a production process. In this context, this work presents a study on the use of CAD/CAM programming resources in the milling of cavities in VP50 steel with ball-nose cemented carbide inserts. The influence of two types of interpolation (linear and circular) and two tolerances (0.05 mm and 0.1 mm) defining the tool path was investigated in the machining of a cavity shaped like a mold for a cell phone battery cover. The output variables evaluated were the machining time, the number of lines in the program, the surface roughness parameters (Ra, Rq, Rz) of the cavities, the radius of curvature, the form deviation of an arbitrary line and the tool wear. From the results, it was found that, from a statistical point of view (ANOVA), none of the interpolation and tolerance conditions employed significantly influenced the surface roughness values, the form deviation of an arbitrary line, or the tool wear. 
Linear interpolation with a tolerance of 0.1 mm proved the most viable for producing this cavity under the conditions investigated, owing to the good finish produced, the small tool wear and the shorter machining time.
Master in Mechanical Engineering
APA, Harvard, Vancouver, ISO, and other styles
48

Beudaert, Xavier. "Commande numérique ouverte : interpolation optimisée pour l'usinage 5 axes grande vitesse des surfaces complexes." Phd thesis, École normale supérieure de Cachan - ENS Cachan, 2013. http://tel.archives-ouvertes.fr/tel-00918816.

Full text
Abstract:
The manufacturing process for machined parts has reached maturity with respect to computer-aided manufacturing and control of the machining process. Today, the prospects for major improvement lie in optimizing the numerical controller (CNC) and its interactions with the rest of the manufacturing process. The objective of this thesis is therefore to master the basic building blocks of the numerical controller in order to optimize the 5-axis high-speed machining process for complex surfaces. Creating an open CNC requires developing the algorithms that transform the machining program into sampled setpoints for the machine axes. The first part of the work consists in making the geometry sufficiently continuous, in particular for linearly interpolated 5-axis tool paths, which exhibit tangency discontinuities. Then, temporal interpolation of the path creates the machining trajectory respecting the kinematic constraints, in particular the jerk of each of the machine's 5 axes. The hardware implementation of these algorithms makes it possible to drive a 5-axis high-speed machine tool with an open CNC. The technological locks associated with industrial CNCs are thus removed, and the digital chain is fully controlled from CAD/CAM down to the motion of the axes. Complete control of the CNC offers the possibility of defining the machining path exactly from the CAD model without introducing the geometric deviations inherent in standard description formats. Interpolating the machining trajectory directly on the surface to be machined significantly improves the quality and productivity of complex surface machining. The PREMIUM-OpenCNC numerical controller enables the experimental validation of this work and opens many other avenues for improving the manufacturing process.
APA, Harvard, Vancouver, ISO, and other styles
49

Apaydin, Gökşin. "Photochemistry of CCH : ab initio calculations, interpolation of the surfaces and state-to-state dynamics /." For electronic version search Digital dissertations database. Restricted to UC campuses. Access is free to UC campus dissertations, 2002. http://uclibs.org/PID/11984.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Samozino, Marie. "Voronoi Centered Radial Basis Functions." Phd thesis, Université de Nice Sophia-Antipolis, 2007. http://tel.archives-ouvertes.fr/tel-00178274.

Full text
Abstract:
This thesis addresses the problem of reconstructing surfaces from point clouds. Recent advances in the acquisition of 3D shapes using scanners give rise to new needs in terms of reconstruction algorithms: one must be able to handle large noisy point clouds while giving a compact representation of the reconstructed surface. The surface is reconstructed as the zero level set of a function. Representing a surface implicitly using Radial Basis Functions (RBF) has become a standard approach over the last ten years. An interesting problem is reducing the number of basis functions to obtain as compact a representation as possible and to reduce evaluation times. Reducing the number of basis functions amounts to reducing the number of points (centres) on which they are centred. Our goal is to select a "small" set of centres that is as relevant as possible. To reduce the number of centres while keeping a maximum of information, we drop the correspondence between function centres and data points, which is imposed in almost all RBF approaches. Instead, we place the centres on the medial axis of the set of data points and show that this choice is appropriate. To do so, we use the tools provided by computational geometry and approximate the medial axis by a subset of the vertices of the Voronoi diagram of the data points. We also propose two different approaches that sample the medial axis appropriately, in order to adapt the level of detail of the reconstructed surface to the budget of centres allotted by the user.
APA, Harvard, Vancouver, ISO, and other styles