Academic literature on the topic 'Makima interpolation'

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Makima interpolation.'


Journal articles on the topic "Makima interpolation"

1

Jaffar, Azhar, Norashikin M. Thamrin, Megat Syahirul Amin Megat Ali, Mohamad Farid Misnan, Ahmad Ihsan Mohd Yassin, and Noorolpadzilah Mohamed Zan. "Spatial interpolation method comparison for physico-chemical parameters of river water in Klang River using MATLAB." Bulletin of Electrical Engineering and Informatics 11, no. 4 (2022): 2368–77. http://dx.doi.org/10.11591/eei.v11i4.3615.

Abstract:
Water quality is one of the most highly debated issues worldwide at the moment. Inadequate water supplies affect human health, hinder food production, and degrade the environment. Using contemporary technology to analyze pollution statistics can help solve pollution issues. One option is to take advantage of advancements in intelligent data processing to conduct hydrological parameter analysis. To perform conclusive water quality studies, a lot of data is necessary. Unfilled data (information gaps) in the long-term hydrological data set may be due to equipment faults, collection schedule delays, or the data collection officer's absence. The lack of hydrological data skews its interpretation. Therefore, interpolation is used to recreate and fill missing hydrological data. From 2012 to 2017, the Klang River's biochemical oxygen demand (BOD) in Selangor, Malaysia, was sampled. This study examined three methods of interpolation for their effectiveness using the MATLAB software: piecewise cubic Hermite interpolating polynomial (PCHIP), cubic spline data interpolation (Spline), and modified Akima piecewise cubic Hermite interpolation (Makima). The accuracy is assessed using root mean square error (RMSE). All interpolation algorithms offer excellent results with low RMSE. However, PCHIP delivers the best match between interpolated and original data.
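Since the "Makima" referenced throughout these results is MATLAB's modified Akima piecewise cubic Hermite interpolant, a minimal NumPy sketch of the method may help orient readers. The weight rule below follows the published modified-Akima formula; the function names are ours, and in practice MATLAB's `makima` (or, in recent SciPy, `Akima1DInterpolator` with `method="makima"`) should be preferred:

```python
import numpy as np

def makima_slopes(x, y):
    """Node derivatives for modified-Akima (makima) interpolation.

    Uses weights w = |d2 - d1| + |d2 + d1| / 2, which (unlike plain
    Akima's |d2 - d1|) damp overshoot next to flat regions of the data.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    d = np.diff(y) / np.diff(x)                      # secant slopes
    # Akima's end extension: two extrapolated slopes on each side.
    d = np.concatenate(([2 * d[0] - d[1]], d, [2 * d[-1] - d[-2]]))
    d = np.concatenate(([2 * d[0] - d[1]], d, [2 * d[-1] - d[-2]]))
    w1 = np.abs(d[3:] - d[2:-1]) + np.abs(d[3:] + d[2:-1]) / 2
    w2 = np.abs(d[1:-2] - d[:-3]) + np.abs(d[1:-2] + d[:-3]) / 2
    with np.errstate(invalid="ignore"):
        t = (w1 * d[1:-2] + w2 * d[2:-1]) / (w1 + w2)
    return np.where(np.isnan(t), 0.0, t)             # 0/0 means flat data

def makima_interp(xq, x, y):
    """Evaluate the piecewise cubic Hermite makima interpolant at xq."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    t = makima_slopes(x, y)
    i = np.clip(np.searchsorted(x, xq) - 1, 0, len(x) - 2)
    h = x[i + 1] - x[i]
    s = (np.asarray(xq, float) - x[i]) / h
    return ((2 * s**3 - 3 * s**2 + 1) * y[i]
            + (s**3 - 2 * s**2 + s) * h * t[i]
            + (-2 * s**3 + 3 * s**2) * y[i + 1]
            + (s**3 - s**2) * h * t[i + 1])
```

On linear data the slope rule reproduces the secant slope exactly, and on flat runs both weights vanish, which is the case the modified weights are designed to handle gracefully.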
3

Kumar, Rohit, Subrata Bhattacharya, and Govind Murmu. "Exploring Optimality of Piecewise Polynomial Interpolation Functions for Lung Field Modeling in 2D Chest X-Ray Images." Frontiers in Physics 9 (November 3, 2021). http://dx.doi.org/10.3389/fphy.2021.770752.

Abstract:
In this paper, a landmark-based approach using five different interpolating polynomials (linear, cubic convolution, cubic spline, PCHIP, and Makima) for modeling the lung field region in 2D chest X-ray images has been presented. The publicly available Japanese Society of Radiological Technology (JSRT) database has been used for evaluation of the proposed method. Selected radiographs are anatomically landmarked using 17 and 16 anatomical landmark points to represent the left and right lung field regions, respectively. Local, piecewise polynomial interpolation is then employed to create additional semilandmark points to form the lung contour. Jaccard similarity coefficients and Dice coefficients have been used to assess the accuracy of the modeled shape through comparison with the prepared ground truth. With the optimality condition of three intermediate semilandmark points, the PCHIP interpolation method, with an execution time of 5.04873 s, is found to be the most promising candidate for lung field modeling, with average Dice coefficients (DC) of 98.20 and 98.54% (for the left and right lung field, respectively) and average Jaccard similarity coefficients (JSC) of 96.47 and 97.13% for these two lung field regions. While the performance of Makima and cubic convolution is close to that of PCHIP under the same optimality condition, i.e., three intermediate semilandmark points, the optimality condition for the cubic spline method is at least seven intermediate semilandmark points, which, however, does not result in better performance in terms of accuracy or execution time.
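The semilandmark construction this abstract describes can be sketched generically: interpolate x(t) and y(t) separately against a chord-length parameter to densify a landmarked contour. SciPy's `PchipInterpolator` stands in for the paper's MATLAB PCHIP here; the function name and the landmark coordinates in the test are illustrative, not from the paper.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def densify_contour(landmarks, per_segment=3):
    """Insert `per_segment` semilandmarks between consecutive landmarks by
    interpolating x(t) and y(t) along a cumulative chord-length parameter."""
    pts = np.asarray(landmarks, float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)   # segment lengths
    t = np.concatenate(([0.0], np.cumsum(seg)))          # chord-length knots
    fx = PchipInterpolator(t, pts[:, 0])
    fy = PchipInterpolator(t, pts[:, 1])
    # per_segment + 2 parameter values per segment; drop each segment's
    # endpoint to avoid duplicating knots, then append the final landmark.
    tq = np.concatenate([np.linspace(t[i], t[i + 1], per_segment + 2)[:-1]
                         for i in range(len(t) - 1)] + [[t[-1]]])
    return np.column_stack([fx(tq), fy(tq)])
```

With 17 landmarks and three semilandmarks per segment this yields 65 contour points, matching the kind of densification the paper evaluates.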
4

Smith, Matthew G., Graham M. Gibson, and Manlio Tassieri. "i-RheoFT: Fourier transforming sampled functions without artefacts." Scientific Reports 11, no. 1 (2021). http://dx.doi.org/10.1038/s41598-021-02922-8.

Abstract:
In this article we present a new open-access code named "i-RheoFT" that implements the analytical method first introduced in [PRE, 80, 012501 (2009)] and then enhanced in [New J Phys 14, 115032 (2012)], which allows one to evaluate the Fourier transform of any generic time-dependent function that vanishes for negative times, sampled at a finite set of data points that extend over a finite range and need not be equally spaced. i-RheoFT has been employed here to investigate three important experimental factors: (i) the 'density of initial experimental points' describing the sampled function, (ii) the interpolation function used to perform the "virtual oversampling" procedure introduced in [New J Phys 14, 115032 (2012)], and (iii) the detrimental effect of noise on the expected outcomes. We demonstrate that, at relatively high signal-to-noise ratios and densities of initial experimental points, all three built-in MATLAB interpolation functions employed in this work (i.e., Spline, Makima and PCHIP) perform well in recovering the information embedded within the original sampled function, with the Spline function performing best. However, when either the number of initial data points or the signal-to-noise ratio is reduced, there exists a threshold below which all three functions perform poorly, with the worst performance given by the Spline function in both cases, and the least poor by the PCHIP function at low densities of initial data points and by the Makima function at relatively low signal-to-noise ratios. We envisage that i-RheoFT will be of particular interest and use to all those studies where sampled or time-averaged functions, often defined by a discrete set of data points within a finite time window, are exploited to gain new insights into the system's dynamics.
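The "virtual oversampling" step described in this abstract can be imitated in a few lines: interpolate the unevenly sampled signal onto a dense uniform grid, then Fourier-transform it. This is only a schematic stand-in for i-RheoFT's analytical transform, with SciPy interpolants in place of the MATLAB ones; the function name and defaults are ours.

```python
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

def oversampled_spectrum(t, g, n=4096, interpolant=CubicSpline):
    """Interpolate unevenly sampled g(t) onto n uniform points, then FFT.

    `interpolant` can be swapped (e.g. PchipInterpolator) to mimic the
    paper's comparison of interpolation functions.
    """
    f = interpolant(t, g)                 # fit on the uneven samples
    tu = np.linspace(t[0], t[-1], n)      # dense uniform grid
    freq = np.fft.rfftfreq(n, d=tu[1] - tu[0])
    return freq, np.fft.rfft(f(tu))
```

Swapping `CubicSpline` for `PchipInterpolator` reproduces, in spirit, the Spline-versus-PCHIP comparison the paper performs.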
5

Zhu, Jun, Yuanda Zhang, Guangxian Yang, and Shuxian Liu. "Assessing different models to predict the growth and development of pepper plants under water deficits." Frontiers in Plant Science 15 (December 11, 2024). https://doi.org/10.3389/fpls.2024.1436209.

Abstract:
To construct pepper development simulation models under drought, experiments with water capacities of 45–55%, 55–65%, 65–75% or 75–85% and exposure durations of 2, 4, 6 or 8 d (Exp. 1 & 2), and of 50–60%, 60–70% or 70–80% with exposure durations of 3, 5, and 7 d (Exp. 3), were conducted with "Sanying" pepper. Physiological development time (PDT), the product of thermal effectiveness and PAR (photosynthetically active radiation) (TEP), and growing degree days (GDD) were used to simulate growth under the various treatments in Exp. 1. Plant development was influenced by the severity and duration of drought. Mild water deficits (65–75% for 2–6 d or 55–65% for 2–4 d) accelerated development, while severe water deficits (65–75% for 8 d, 55–65% for 6–8 d or 45–55% for 2–8 d) delayed development. The PDT gave the highest coefficient of determination (R2, 0.89–0.94) and the lowest root mean squared error (RMSE, average of 1.03–1.50 d) and relative error (RE, average of 1.60–1.88%) for simulating three growth periods (Exp. 2). It was therefore used to construct growth models under water capacities of 45–85% over 2–8 d with spline, cubic, makima, linear, and nearest interpolation. Validation in Exp. 3 indicated that the spline model was optimal, having the highest R2 (0.96–0.97) and the lowest RMSE (average of 1.31–1.75 d) and RE (average of 1.18–2.06%). The results of the study can help producers to optimize water management and to develop drought strategies for production.
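The model-selection loop used in studies like this one (fit several interpolants, score them on held-out samples) can be sketched generically. SciPy interpolants stand in for MATLAB's spline/pchip/makima family, with plain Akima as an assumed stand-in for makima; `heldout_rmse` and its parameters are our names, not from the paper.

```python
import numpy as np
from scipy.interpolate import (Akima1DInterpolator, CubicSpline,
                               PchipInterpolator, interp1d)

def heldout_rmse(x, y, methods, n_holdout=5, seed=0):
    """RMSE of each interpolant on randomly held-out interior samples."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(np.arange(1, len(x) - 1), size=n_holdout, replace=False)
    keep = np.setdiff1d(np.arange(len(x)), idx)
    scores = {}
    for name, make in methods.items():
        pred = make(x[keep], y[keep])(x[idx])      # refit, predict held-out
        scores[name] = float(np.sqrt(np.mean((pred - y[idx]) ** 2)))
    return scores

methods = {
    "spline": CubicSpline,
    "pchip": PchipInterpolator,
    "akima": Akima1DInterpolator,                  # stand-in for makima
    "linear": lambda x, y: interp1d(x, y),
    "nearest": lambda x, y: interp1d(x, y, kind="nearest"),
}
```

Holding out only interior points keeps every query inside the fitted range, so no interpolant is asked to extrapolate.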
6

Henderson, John, and Robert Peeling. "A framework for early-stage sustainability assessment of innovation projects enabled by weighted sum multi-criteria decision analysis in the presence of uncertainty." Open Research Europe, August 5, 2024. https://doi.org/10.12688/openreseurope.18195.1.

Abstract:
A two-level hierarchical framework for early-stage sustainability assessment (FESSA) amongst a set of alternatives, applicable from the earliest stages of process or product development, is introduced, and its use in combination with an improved weighted-sum-method multi-criteria decision analysis (WSM-MCDA) in the presence of uncertainty is presented through application to a case study based upon a real-world decision scenario from speciality polymer manufacture. The approach taken addresses the challenge faced by those responsible for innovation management in the manufacturing process industries to make simultaneously timely and rational decisions early in the innovation cycle, when knowledge gaps and uncertainty about the options tend to be at their highest. The Computed Uncertainty Range Evaluations (CURE) WSM-MCDA method provides better discrimination than the existing Multiple Attribute Range Evaluations (MARE) method without the computational burden of generating heuristic outcome distributions via Monte-Carlo simulation. This paper introduces a framework that teams can use to think systematically about the wide range of criteria which go into deciding whether a proposed innovation enhances sustainability or not, and shows how an improved method for multiple-criteria decision analysis can be used to put it into practice, with an example drawn from the speciality chemicals industry. Innovation in the manufacturing process industries requires decisions to be made. In individual projects, scientists and technical managers must decide which technology, materials, and equipment to use. Equally, those responsible for directing a portfolio of projects must choose which projects to prioritise. In either case, early decision making is desirable to avoid sinking time and money into dead-end projects, and to identify what further work is needed for projects with a future. The earlier you decide, however, the harder it can be to obtain firm evidence (e.g. conclusive experimental data, fully validated costings, or life cycle impacts) upon which to base your decision. The growing societal expectation that sustainability criteria are factored into such decisions merely adds to the challenges faced by the decision maker. Decisions must be made upon the evidence that is available, combined with the informed judgement of those with knowledge of the system under consideration. This is best approached as a facilitated, team-based activity where assertions, assumptions and interpolations or extrapolations from the limited data can be tested and challenged. A sound decision-making process needs a suitable computational method for turning this complex qualitative and semi-quantitative assessment into a clear output indicator of potential success or failure for the options under consideration. The method described in this paper addresses this need but, just as importantly, the methodology ensures that the thought process behind whatever decision is indicated is clearly and transparently documented for future reference.

Dissertations / Theses on the topic "Makima interpolation"

1

Peng, Song. "RESTORE PCM TELEMETRY SIGNAL WAVEFORM BY MAKING USE OF MULTI-SAMPLE RATE INTERPOLATION TECHNOLOGY." International Foundation for Telemetering, 1999. http://hdl.handle.net/10150/607318.

Abstract:
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada. There are two common misconceptions about PCM telemetry systems: that the waveform cannot be restored accurately, and that accurate restoration requires sampling the measured signal at a higher sample rate. This paper shows that, by making use of multi-sample-rate DSP technology, the sample rate of a measured signal can be reduced in the transmission equipment, or system precision can be retained even when the performance of the low-pass filter declines.

Books on the topic "Makima interpolation"

1

Bond, W. Soil Physical Methods for Estimating Recharge - Part 3. CSIRO Publishing, 1998. http://dx.doi.org/10.1071/9780643105355.

Abstract:
Measurements in and just below the plant root zone, using principles of soil physics, can be used to estimate recharge. This booklet describes the Zero Flux Plane Method, methods based on Darcy's law, and Lysimetry for making such estimates. The work presents the basic concepts of soil water physics that will be referred to in this and other booklets in the series. Another method, the Soil Water Flux Meter, is discussed briefly, but as this is not sufficiently well developed for routine use, readers are referred elsewhere for full details. All these methods require that consideration be given to interpolation over time and spatial extrapolation or averaging. A brief discussion of this is given.
2

Wikle, Christopher K. Spatial Statistics. Oxford University Press, 2018. http://dx.doi.org/10.1093/acrefore/9780190228620.013.710.

Abstract:
The climate system consists of interactions between physical, biological, chemical, and human processes across a wide range of spatial and temporal scales. Characterizing the behavior of components of this system is crucial for scientists and decision makers. There is substantial uncertainty associated with observations of this system as well as our understanding of various system components and their interaction. Thus, inference and prediction in climate science should accommodate uncertainty in order to facilitate the decision-making process. Statistical science is designed to provide the tools to perform inference and prediction in the presence of uncertainty. In particular, the field of spatial statistics considers inference and prediction for uncertain processes that exhibit dependence in space and/or time. Traditionally, this is done descriptively through the characterization of the first two moments of the process, one expressing the mean structure and one accounting for dependence through covariability. Historically, there are three primary areas of methodological development in spatial statistics: geostatistics, which considers processes that vary continuously over space; areal or lattice processes, which are defined on a countable discrete domain (e.g., political units); and spatial point patterns (or point processes), which consider the locations of events in space to be a random process. All of these methods have been used in the climate sciences, but the most prominent has been the geostatistical methodology. This methodology was simultaneously discovered in geology and in meteorology and provides a way to do optimal prediction (interpolation) in space and can facilitate parameter inference for spatial data. These methods rely strongly on Gaussian process theory, which is increasingly of interest in machine learning. These methods are common in the spatial statistics literature, but much development is still being done in the area to accommodate more complex processes and "big data" applications. Newer approaches are based on restricting models to neighbor-based representations or reformulating the random spatial process in terms of a basis expansion. There are many computational and flexibility advantages to these approaches, depending on the specific implementation. Complexity is also increasingly being accommodated through the use of the hierarchical modeling paradigm, which provides a probabilistically consistent way to decompose the data, process, and parameters corresponding to the spatial or spatio-temporal process. Perhaps the biggest challenge in modern applications of spatial and spatio-temporal statistics is to develop methods that are flexible yet can account for the complex dependencies between and across processes, account for uncertainty in all aspects of the problem, and still be computationally tractable. These are daunting challenges, yet it is a very active area of research, and new solutions are constantly being developed. New methods are also being rapidly developed in the machine learning community, and these methods are increasingly more applicable to dependent processes. The interaction and cross-fertilization between the machine learning and spatial statistics community is growing, which will likely lead to a new generation of spatial statistical methods that are applicable to climate science.
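The "optimal prediction (interpolation)" that the geostatistical methodology provides reduces, in its simplest form, to the Gaussian-process conditional mean. A minimal simple-kriging sketch under an assumed zero mean and Gaussian covariance (all names and parameter values below are illustrative, not from this reference):

```python
import numpy as np

def simple_kriging(x_obs, z_obs, x_new, length=1.0, sill=1.0, nugget=1e-10):
    """Zero-mean simple kriging: z* = c^T C^{-1} z, Gaussian covariance."""
    def cov(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return sill * np.exp(-(d / length) ** 2)
    # Covariance among observations (small nugget for numerical stability)
    C = cov(x_obs, x_obs) + nugget * np.eye(len(x_obs))
    # One column of kriging weights per prediction location
    weights = np.linalg.solve(C, cov(x_obs, x_new))
    return weights.T @ z_obs
```

With a negligible nugget the predictor interpolates exactly at observed locations and relaxes to the (zero) mean far from all data, which is the behavior the descriptive two-moment formulation above implies.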

Book chapters on the topic "Makima interpolation"

1

Favorskaya, Alena V. "Interpolation on Unstructured Triangular Grids." In Innovations in Wave Processes Modelling and Decision Making. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76201-2_2.

2

Favorskaya, Alena V. "Interpolation on Unstructured Tetrahedral Grids." In Innovations in Wave Processes Modelling and Decision Making. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76201-2_3.

3

Favorskaya, Alena V. "Piecewise Linear Interpolation on Unstructured Tetrahedral Grids." In Innovations in Wave Processes Modelling and Decision Making. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76201-2_4.

4

Chen, Chengyuan, and Qiang Shen. "Rough-Fuzzy Rule Interpolation for Data-Driven Decision Making." In Advances in Intelligent Systems and Computing. Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-87094-2_3.

5

Lieber, Jean, Emmanuel Nauer, Henri Prade, and Gilles Richard. "Making the Best of Cases by Approximation, Interpolation and Extrapolation." In Case-Based Reasoning Research and Development. Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-01081-2_38.

6

Boutry, Nicolas, Thierry Géraud, and Laurent Najman. "On Making nD Images Well-Composed by a Self-dual Local Interpolation." In Advanced Information Systems Engineering. Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-319-09955-2_27.

7

Bhattacharjee, Kalyan Shankar, Hemant Kumar Singh, and Tapabrata Ray. "Enhanced Pareto Interpolation Method to Aid Decision Making for Discontinuous Pareto Optimal Fronts." In AI 2017: Advances in Artificial Intelligence. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-63004-5_8.

8

Osinin, Ilya. "Optimization of the Hardware Costs of Interpolation Converters for Calculations in the Logarithmic Number System." In Recent Research in Control Engineering and Decision Making. Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-12072-6_9.

9

Nešić, Ivan, Pavle Milošević, Aleksandar Rakicevic, Bratislav Petrović, and Dragan G. Radojević. "Modeling Candlestick Patterns with Interpolative Boolean Algebra for Investment Decision Making." In Soft Computing Applications. Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-33941-7_12.

10

Inoue, Masato, Yosuke Kawasaki, Takuma Suzuki, Yuta Washimi, Tsutomu Tanimoto, and Masaki Takahashi. "Point Cloud Interpolation by RGB Image to Estimate Road Surface Profile for Preview Suspension Control." In Lecture Notes in Mechanical Engineering. Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-70392-8_95.

Abstract:
The growing prevalence of autonomous driving is expected to shift passengers' attention away from driving, increasing the demand for enhanced ride comfort. Studies addressing ride comfort have prominently explored active suspension control, with recent research on preview suspension control using on-board sensors. The proposed systems often include LiDAR deployment at the front of the vehicle for high-precision road surface profiles. However, these systems often involve costly sensors such as LiDAR, making them impractical for on-board installation. Nonetheless, in recent autonomous vehicles, LiDAR tends to be mounted on the roof. While it would be beneficial to leverage this LiDAR for preview control, the point cloud obtained from the roof has insufficient density to accurately perceive the unevenness of the road surface. To overcome the low-density issue in point cloud data obtained from LiDAR with fewer channels, this study applies a supervised machine learning model, developed for autonomous driving, to estimate road surface profiles and enhance the precision of these estimations.

Conference papers on the topic "Makima interpolation"

1

Costa, Giancarlo, Maria Pina Limongelli, and Sebastian Thöns. "Innovation through Value of Information of the Interpolation Damage Detection Method: an experiment-based value quantification." In IABSE Congress, San José 2024: Beyond Structural Engineering in a Changing World. International Association for Bridge and Structural Engineering (IABSE), 2024. https://doi.org/10.2749/sanjose.2024.1101.

Abstract:
Innovation plays a crucial role in shaping technological, industrial, and social progress in modern societies. Quantifying the benefits technologies offer in specific decision-making scenarios can foster their innovation process, providing a guide for their development and diffusion. Decision Value Analysis provides metrics to assess the value of technologies at increasing maturity levels. Specifically, the Value of Information can be used to assess and optimize information acquirement and processing technologies. In this study, the Value of Information is applied to guide the development of a technology for damage detection based on experimental data collected from the S101 bridge in Austria. Prior, posterior, and pre-posterior utilities from Bayesian decision theory are calculated, accounting for the integrity management decision scenario, maintenance actions, experimental data, and costs. The utilities are maximized to identify the optimal range of damage detection performance parameters and to provide input for the development of a business model for technology utilization.
2

Pannell, Jared, and Naga Vijayasankaran. "Cathodic Protection Area of Influence Study." In CORROSION 2021. AMPP, 2021. https://doi.org/10.5006/c2021-16877.

Abstract:
As Cathodic Protection (CP) becomes more and more common practice to reduce the risk of external corrosion, areas are becoming congested with CP systems. This is seen predominantly in shared rights-of-way and in plant facilities. In a plant facility, every underground asset can be tied together through piping and electrical grounding. If test point locations do not meet criteria, it can be challenging to determine which CP system should be adjusted to provide proper cathodic protection without negatively impacting other adjacent continuous sub-surface assets. Area of influence testing is a way to identify the amount of IR (voltage drop) from a single CP system that most affects the sub-surface asset. This test can only be performed if the soil is considered homogeneous in resistivity, so that R (resistance due to soil resistivity) becomes a constant. This is where an area of influence study can benefit the owner/operator of the assets. Testing is done by turning all rectifiers off and allowing the area to depolarize. Then a single rectifier is cycled on and off, so that the area cannot polarize, and the IR drop can be recorded accurately at every test point in the area. The step is repeated for every rectifier, one by one, for the areal extent of the study. Readings taken at every test point location indicate the magnitude of influence of the individual CP system that is interrupted at that geographic location. Using Geographic Information System (GIS) technology, discrete test point readings are converted to a continuous interpolated surface using spatial interpolation algorithms such as Inverse Distance Weighted interpolation and Kriging, which can be overlaid on imagery. This visual representation allows engineers to investigate the geographical extent and variance of the CP system influence on underground assets. Cumulative effects of multiple rectifiers in any given region can also be visualized in a similar fashion.
In plant facilities where there are hundreds of CP units, area of influence maps can highlight areas of concern and aids the decision-making process to balance potentials as required to meet criteria.
3

Su, Jie, Cong Tian, and Zhenhua Duan. "Conditional interpolation: making concurrent program verification more effective." In ESEC/FSE '21: 29th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering. ACM, 2021. http://dx.doi.org/10.1145/3468264.3468602.

4

Chen, Chengyuan, Guorong Chen, and Lixiao Feng. "Multi-expert decision making using rough-fuzzy rule interpolation." In 2016 IEEE International Conference of Online Analysis and Computing Science (ICOACS). IEEE, 2016. http://dx.doi.org/10.1109/icoacs.2016.7563054.

5

Chen, Chengyuan, and Qiang Shen. "OWA-based fuzzy rule interpolation for group decision making." In 2014 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE). IEEE, 2014. http://dx.doi.org/10.1109/fuzz-ieee.2014.6891804.

6

Tian, Haoyue, Pan Gao, and Xiaojiang Peng. "Video Frame Interpolation Based on Deformable Kernel Region." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/188.

Abstract:
The video frame interpolation task has recently become more and more prevalent in the computer vision field. At present, a number of deep learning approaches have achieved great success. Most of them are based either on optical flow information, or on interpolation kernels, or on a combination of these two methods. However, these methods have ignored that there are grid restrictions on the position of the kernel region when synthesizing each target pixel. These restrictions mean they cannot adapt well to the irregularity of object shape and the uncertainty of motion, which may lead to irrelevant reference pixels being used for interpolation. In order to solve this problem, we revisit deformable convolution for video interpolation, which can break the fixed grid restrictions on the kernel region, making the distribution of reference points more suitable for the shape of the object, and thus warp a more accurate interpolated frame. Experiments are conducted on four datasets to demonstrate the superior performance of the proposed model in comparison to state-of-the-art alternatives.
7

Craciunescu, Oana I., Shiva K. Das, Terrence Z. Wong, and Thaddeus V. Samulski. "Fractal Reconstruction of Breast Perfusion Before and After Hyperthermia Treatments." In ASME 2002 International Mechanical Engineering Congress and Exposition. ASMEDC, 2002. http://dx.doi.org/10.1115/imece2002-33692.

Abstract:
Thermal modeling for hyperthermia breast patients can provide relevant information to better understand the temperatures achieved during treatment. However, the human breast is highly perfused, making knowledge of the perfusion crucial to the accuracy of the temperature computations. It has been shown that the perfusion of blood in tumor tissue can be approximated using the relative perfusion index (RPI) determined from dynamic contrast-enhanced magnetic resonance imaging (DE-MRI). It was also concluded that the 3D reconstruction of tumor perfusion can be performed using fractal interpolation functions (FIF). The technique used was called piecewise hidden variable fractal interpolation (PHVFI). Changes in the protocol parameters for the dynamic MRI sequences in breast patients allowed us to acquire more spatial slices, and hence the possibility to actually verify the accuracy of the fractal interpolation. The interpolated slices were compared to the imaged slices in the original set. The accuracy of the interpolation was tested on a post-hyperthermia-treatment data set. The difference between the reconstruction and the original slice varied from 2 to 5%. Significantly, the fractal dimension of the interpolated slices is within 2–3% of that of the original images, thus preserving the fractality of the perfusion maps. The use of such a method becomes crucial when tumor size and imaging restrictions limit the number of spatial slices, requiring interpolation to fill the data between the slices.
8

Bhavana, S. R., T. Radhakrishnan, and Anju S. "SPATIAL PREDICTION OF SOIL ORGANIC CARBON IN PANAMARAM BLOCK, KERALA - A COMPARITIVE STUDY OF VARIOUS TECHNIQUES." In Second International Conference in Civil Engineering for a Sustainable Planet. AIJR Publisher, 2025. https://doi.org/10.21467/proceedings.179.18.

Full text
Abstract:
Soil Organic Carbon (SOC) is the carbon contained within soil organic matter; it plays a central role in soil health, fertility, and ecosystem services, including food production, making its preservation and restoration essential for sustainable agricultural development. SOC plays an important role in the global carbon cycle and, consequently, in climate change mitigation and adaptation. Sustainable management of SOC in the soil demands scientific knowledge of its spatial distribution. In this study, we used five spatial interpolation algorithms to spatially predict the SOC in the Panamaram Block of Wayanad District in Kerala. Five spatial interpolation algorithms, including two variants of Kriging, were applied to the soil nutrient data. The performance of the prediction algorithms was compared using prediction accuracy parameters such as R2, RMSE, and MAE. Experimental results revealed that predictions generated by Inverse Distance Weighting (IDW) exhibit the highest accuracy with an R2 value of 99.25%, followed by the Radial Basis Polynomial Interpolation Method (67.03%), Ordinary Kriging (26.79%), Local Polynomial Interpolation (18.96%), and the Simple Kriging Method (13.07%). This research contributes valuable insights into the spatial distribution of Soil Organic Carbon, which is pivotal for informed land management and environmental conservation strategies.
APA, Harvard, Vancouver, ISO, and other styles
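The IDW predictor that performed best in the comparison above is simple enough to sketch: each prediction is a weighted mean of the observed values, with weights proportional to inverse distance raised to a power. The snippet below is a generic implementation; the power parameter, point layout, and SOC-like values are illustrative assumptions, not the study's data:

```python
import numpy as np

def idw_predict(xy_known, z_known, xy_query, power=2.0):
    """Inverse Distance Weighting: weighted mean of observed values,
    with weights 1/d**power to each sample point."""
    xy_known = np.asarray(xy_known, float)
    z_known = np.asarray(z_known, float)
    preds = []
    for q in np.atleast_2d(np.asarray(xy_query, float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d == 0):                  # query coincides with a sample
            preds.append(z_known[d == 0][0])
            continue
        w = 1.0 / d**power
        preds.append(np.dot(w, z_known) / w.sum())
    return np.array(preds)

# Toy samples on a unit square (values illustrative only)
pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
soc = [1.2, 1.8, 1.5, 2.1]
print(idw_predict(pts, soc, [(0.5, 0.5)]))  # equidistant query -> plain mean
```

Note that IDW is an exact interpolator at the sample points and never extrapolates outside the observed value range, which is one reason it can score well on cross-validation-style accuracy metrics.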
9

Qiu, Yan, Kai He, Haitao Fang, and Dazhuang Zhu. "Multi-axis jewelry-making system and its four-axis interpolation algorithm." In 2015 IEEE International Conference on Information and Automation (ICIA). IEEE, 2015. http://dx.doi.org/10.1109/icinfa.2015.7279589.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

González, Francisco, Manuel González, and Javier Cuadrado. "Weak Coupling of Multibody Dynamics and Block Diagram Simulation Tools." In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-86653.

Full text
Abstract:
Dynamic simulation of complex mechatronic systems can be carried out in an efficient and modular way making use of weakly coupled co-simulation setups. When using this approach, multirate methods are often needed to improve efficiency, since the physical components of the system usually have different frequencies and time scales. However, most multirate methods have been designed for strongly coupled setups, and their application in weakly coupled co-simulations is not straightforward due to the limitations imposed by the commercial simulation tools used for mechatronics design. This work describes a weakly coupled multirate method applied to combine a block diagram simulator (Simulink) with a multibody dynamics simulator in a co-simulation setup. A double-mass triple-spring system with a known analytical solution is used as a test problem in order to investigate the behavior of the method as a function of the frequency ratio (FR) of the coupled subsystems. Several synchronization schemes (fastest-first and slowest-first) and interpolation/extrapolation methods (polynomials of different order and smoothing) have been tested. Results show that the slowest-first methods deliver the best results, combined with a cubic interpolation (for FR < 25) or without interpolation (for 25 < FR < 50). For FR > 50, none of the tested methods can deliver precise results, although smoothing techniques can reduce interpolation errors in certain situations.
APA, Harvard, Vancouver, ISO, and other styles
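As a rough illustration of the interpolation step in such a weakly coupled scheme, the slow subsystem's output can be evaluated inside a fast micro step by fitting a cubic through the last few exchanged macro-step samples. This is a generic sketch under assumed names and signals, not the paper's coupling code:

```python
import numpy as np

def cubic_coupling_signal(t_hist, u_hist):
    """Slowest-first coupling sketch: the slow subsystem advances one macro
    step first; the fast subsystem then evaluates the slow output at its
    micro-step times via a cubic through the last four exchange points."""
    coeffs = np.polyfit(t_hist[-4:], u_hist[-4:], deg=3)
    return np.poly1d(coeffs)

# Slow subsystem output sampled at macro steps (here u(t) = sin t, illustrative)
H = 0.1                                   # macro step size
t_macro = np.arange(0, 5 * H, H)
u_macro = np.sin(t_macro)

u_interp = cubic_coupling_signal(t_macro, u_macro)
t_micro = 0.35                            # a micro-step time inside the last macro step
print(u_interp(t_micro), np.sin(t_micro))  # interpolant vs. true slow output
```

Higher-order fits reduce the coupling error for smooth exchanged signals, but (as the abstract notes) interpolation alone stops helping once the frequency ratio between the subsystems grows large.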

Reports on the topic "Makima interpolation"

1

Kingston, A. W., A. Mort, C. Deblonde, and O. H. Ardakani. Hydrogen sulfide (H2S) distribution in the Triassic Montney Formation of the Western Canadian Sedimentary Basin. Natural Resources Canada/CMSS/Information Management, 2022. http://dx.doi.org/10.4095/329797.

Full text
Abstract:
The Montney Formation is a highly productive hydrocarbon reservoir with significant reserves of hydrocarbon gases and liquids, making it of great economic importance to Canada. However, high concentrations of hydrogen sulfide (H2S) have been encountered during exploration and development, with detrimental effects on the environmental, health, and economic aspects of production. H2S is a highly toxic and corrosive gas, and it is therefore essential to understand its distribution within the basin in order to identify areas with a high risk of encountering elevated H2S concentrations and to mitigate potential negative impacts. Gas composition data from Montney wells are routinely collected by operators for submission to provincial regulators and are publicly available. We have combined data from Alberta (AB) and British Columbia (BC) to create a basin-wide database of Montney H2S concentrations. We then used an iterative quality control and quality assurance process to produce a dataset that best represents gas composition in reservoir fluids. This included: 1) designating the gas source formation based on directional surveys, using a newly developed basin-wide 3D model that combines the AGS's Montney model of Alberta with a model in BC, which removes errors associated with reported formations; 2) removing injection and disposal wells; 3) assessing the wells with the 50 highest H2S concentrations to determine whether their gas composition data are accurate and reflective of reservoir fluid chemistry; and 4) evaluating spatially isolated extreme values to ensure data accuracy and prevent isolated highs from negatively impacting the data interpolation. The resulting dataset was then used to calculate statistics for each x, y location as input to the interpolation process. Three interpolations were constructed based on the associated phase classification: H2S in gas, H2S in liquid (C7+), and aqueous H2S.
We used Empirical Bayesian Kriging interpolation to generate H2S distribution maps along with a series of model uncertainty maps. These interpolations illustrate that H2S is heterogeneously distributed across the Montney basin. In general, higher concentrations are found in AB than in BC, with the highest concentrations in the Grande Prairie region along with several other isolated regions in the southeastern portion of the basin. The interpolations of H2S associated with different phases show broad similarities. Future mapping research will focus on subdividing intra-Montney sub-members plus under- and overlying strata to further our understanding of the role migration plays in H2S distribution within the Montney basin.
APA, Harvard, Vancouver, ISO, and other styles
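Empirical Bayesian Kriging automates the variogram-estimation step through repeated simulation; the underlying kriging predictor itself can be illustrated with plain ordinary kriging under an assumed exponential variogram. The sketch below is a simplified stand-in for EBK, with made-up variogram parameters and data, not the report's workflow:

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=2.0):
    """Exponential variogram model (assumed; EBK would estimate this)."""
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, q, sill=1.0, rng=2.0):
    """Ordinary kriging prediction at query point q: solve the kriging
    system with a Lagrange multiplier enforcing weights that sum to 1."""
    xy = np.asarray(xy, float)
    z = np.asarray(z, float)
    q = np.asarray(q, float)
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, sill, rng)
    A[n, :n] = A[:n, n] = 1.0            # unbiasedness constraint
    b = np.append(exp_variogram(np.linalg.norm(xy - q, axis=1), sill, rng), 1.0)
    w = np.linalg.solve(A, b)[:n]        # kriging weights
    return float(w @ z)

# Toy well locations with H2S-like values (illustrative only)
wells = [(0, 0), (1, 0), (0, 1), (1, 1)]
h2s = [5.0, 10.0, 12.0, 20.0]
print(ordinary_kriging(wells, h2s, (0.5, 0.5)))
```

With no nugget effect, kriging is an exact interpolator: predicting at a sampled location returns the sampled value, and the same linear system also yields the kriging variance that uncertainty maps are built from.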
2

McMartin, I., M. S. Gauthier, and A. V. Page. Updated post-glacial marine limits along western Hudson Bay, central mainland Nunavut and northern Manitoba. Natural Resources Canada/CMSS/Information Management, 2022. http://dx.doi.org/10.4095/330940.

Full text
Abstract:
A digital compilation of updated postglacial marine limits was completed in the coastal regions of central mainland Nunavut and northern Manitoba between Churchill and Queen Maud Gulf. The compilation builds on and updates previous mapping of the marine limits at an unprecedented scale, making use of high-resolution digital elevation models, new field-based observations of the marine limit, and digital compilations of supporting datasets (i.e. marine deltas and marine sediments). The updated mapping also permits a first-hand, knowledge-driven interpolation of a continuous limit of marine inundation linking the Tyrrell Sea to Arctic Ocean seawaters. The publication includes a detailed description of the mapping methods, a preliminary interpretation of the results, and a GIS scalable layout map for easy access to the various layers. These datasets and outputs provide robust constraints for reconstructing the patterns of ice retreat and for glacio-isostatic rebound models, important for the estimation of relative sea level changes and their impacts on the construction of nearshore sea-transport infrastructure. They can also be used to evaluate the maximum extent of marine sediments and associated permafrost conditions that can affect land-based infrastructure, as well as potential secondary processes related to marine action in the surficial environment, and can therefore enhance the interpretation of geochemical anomalies in glacial drift exploration methods. A generalized map of the maximum limit of postglacial marine inundation, produced for map representation and readability, also constitutes an accessible output relevant to Northerners and other users of geoscience data.
APA, Harvard, Vancouver, ISO, and other styles
