Academic literature on the topic 'Interpolator testing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Interpolator testing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Interpolator testing"

1. Petráček, Petr, Petr Fojtů, Tomáš Kozlok, and Matěj Sulitka. "Effect of CNC Interpolator Parameter Settings on Toolpath Precision and Quality in Corner Neighborhoods." Applied Sciences 12, no. 19 (2022): 9496. http://dx.doi.org/10.3390/app12199496.

Abstract:
Surface quality, machining time, and precision of the final workpiece are key criteria of optimization in CNC machining. These criteria are influenced by multiple factors, such as path interpolation, feed drive system settings, machine dynamics, and the manufacturing process. The properties of the output of the interpolator indirectly influence all subsequent phases of the machining process, thus influencing the quality of the end product. This paper focuses on the effects of interpolator settings on toolpath quality and precision in corner neighborhoods for the commercial Heidenhain iTNC interpolator. A novel method of toolpath quality evaluation suitable for interpolator output toolpaths is proposed, and the effect of multiple CNC parameters on toolpath quality and precision in corner neighborhoods is quantified based on results obtained on a testing toolpath and verified on a toolpath composed of linear segments only. Both toolpath quality and precision were found to depend primarily on the parameters of limit frequency, contour tolerance, and corner jerk settings with precision additionally depending on angle size. The results show that both toolpath quality and precision in corner neighborhoods can be successfully controlled by the corner jerk limit parameter settings. The presented methodology provides a practical guide for CNC parameter settings in Heidenhain interpolators aimed at predicting toolpath quality and precision in corner neighborhoods.
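
The paper's quality-evaluation method is not reproduced here; as a generic illustration of testing an interpolator's corner behavior, the sketch below (Python, with an invented toy toolpath) measures precision as the minimum distance between the programmed corner vertex and the sampled output toolpath.

```python
import numpy as np

def corner_deviation(path_xy: np.ndarray, corner_xy: np.ndarray) -> float:
    """Minimum distance from the programmed corner vertex to the sampled
    interpolator output path: a crude precision proxy, not the quality
    metric proposed in the cited paper."""
    return float(np.linalg.norm(path_xy - corner_xy, axis=1).min())

# Toy 90-degree corner at the origin: incoming move along -X, outgoing along +Y,
# with an artificial offset standing in for the interpolator's corner rounding.
t = np.linspace(0.0, 1.0, 2001)
path = np.column_stack([np.where(t < 0.5, 1.0 - 2.0 * t, 0.0),
                        np.where(t < 0.5, 0.0, 2.0 * t - 1.0)])
path += 0.02 * np.exp(-((t - 0.5) ** 2) / 0.001)[:, None]
print(f"corner deviation: {corner_deviation(path, np.array([0.0, 0.0])):.4f}")
```
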
2. Dutra e Silva Júnior, Élvio Carlos, Leandro Soares Indrusiak, Weiler Alves Finamore, and Manfred Glesner. "A Programmable Look-Up Table-Based Interpolator with Nonuniform Sampling Scheme." International Journal of Reconfigurable Computing 2012 (2012): 1–14. http://dx.doi.org/10.1155/2012/647805.

Abstract:
Interpolation is a useful technique for storing complex functions in limited memory space: a few sampled values are stored in a memory bank, and the function values in between are calculated by interpolation. This paper presents a programmable Look-Up Table-based interpolator, which uses a reconfigurable nonuniform sampling scheme: the sampled points are not uniformly spaced. Their distribution can also be reconfigured to minimize the approximation error on specific portions of the interpolated function’s domain. Switching from one set of configuration parameters to another set, selected on the fly from a variety of precomputed parameters, and using different sampling schemes allow for the interpolation of a plethora of functions, achieving memory saving and minimum approximation error. As a case study, the proposed interpolator was used as the core of a programmable noise generator: output signals drawn from different Probability Density Functions were produced for testing FPGA implementations of chaotic encryption algorithms. As a result of the proposed method, the interpolation of a specific transformation function on a Gaussian noise generator reduced the memory usage to 2.71% when compared to the traditional uniform sampling scheme method, while keeping the approximation error below a threshold equal to 0.000030518.
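
A software analogue of the core idea, not the paper's FPGA architecture, can be sketched in a few lines: store the function at nonuniformly spaced breakpoints, denser where it curves most, and evaluate by piecewise-linear lookup. The target function, table size, and breakpoint mapping below are illustrative assumptions.

```python
import numpy as np

def lut_interpolate(x, xp, fp):
    """Piecewise-linear LUT evaluation; the breakpoints xp may be nonuniform."""
    return np.interp(x, xp, fp)

f = np.sqrt                    # toy target (stands in for a noise-shaping transform)
n = 17                         # LUT size, purely illustrative
x_eval = np.linspace(0.0, 1.0, 10001)

# Uniform breakpoints versus breakpoints clustered near x = 0, where sqrt curves most.
xp_uniform = np.linspace(0.0, 1.0, n)
xp_nonuni = np.linspace(0.0, 1.0, n) ** 2

err_uni = np.max(np.abs(lut_interpolate(x_eval, xp_uniform, f(xp_uniform)) - f(x_eval)))
err_non = np.max(np.abs(lut_interpolate(x_eval, xp_nonuni, f(xp_nonuni)) - f(x_eval)))
print(f"max error, uniform LUT:    {err_uni:.5f}")   # ~0.0625
print(f"max error, nonuniform LUT: {err_non:.5f}")   # ~0.0156
```
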
3. Liu, Yu, and Jie Liu. "Chord Error Closed-Loop Controlled NURBS Interpolator and Trajectory Planning." Applied Mechanics and Materials 16-19 (October 2009): 925–29. http://dx.doi.org/10.4028/www.scientific.net/amm.16-19.925.

Abstract:
Focusing on the problem of NURBS curve interpolation in high-speed machining, a new trajectory planning algorithm suitable for a chord-error closed-loop controlled interpolator is proposed. The trajectory planner decides whether to accelerate, decelerate, or maintain the last velocity in the next period by judging the braking distance. The real-time characteristics are validated by measuring the calculation time on different CPU cores. Simulation shows that the chord-error closed-loop interpolator can automatically adjust the velocity to satisfy the precision demand by calculating the curvature. In addition, it ensures that the maximal velocity and acceleration match the referenced parameters and that the machine runs entirely with the dynamic characteristics set by the operator.
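
The link between curvature, chord error, and feedrate mentioned in the abstract can be illustrated with the standard chord-error bound for an arc of curvature radius rho sampled every interpolation period T: the feedrate must satisfy v <= (2/T) * sqrt(2*rho*delta - delta^2) to keep the chord error below delta. A minimal sketch of that bound (not the authors' closed-loop planner), with illustrative units:

```python
import numpy as np

def chord_error_feedrate_limit(rho: float, delta: float, T: float) -> float:
    """Maximum feedrate (mm/s) such that a chord of length v*T spanning an arc
    of curvature radius rho (mm) deviates from it by at most delta (mm)."""
    return (2.0 / T) * np.sqrt(max(2.0 * rho * delta - delta**2, 0.0))

# Illustrative values: 1 ms interpolation period, 1 micron chord-error tolerance.
T, delta = 1e-3, 1e-3
for rho in (1.0, 10.0, 100.0):          # mm
    v = chord_error_feedrate_limit(rho, delta, T)
    print(f"rho = {rho:6.1f} mm  ->  v_max = {v:8.1f} mm/s")
```
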
4. Rodriguez-Perez, Daniel, and Noela Sanchez-Carnero. "Multigrid/Multiresolution Interpolation: Reducing Oversmoothing and Other Sampling Effects." Geomatics 2, no. 3 (2022): 236–53. http://dx.doi.org/10.3390/geomatics2030014.

Abstract:
Traditional interpolation methods, such as IDW, kriging, radial basis functions, and regularized splines, are commonly used to generate digital elevation models (DEM). All of these methods have strong statistical and analytical foundations (such as the assumption of randomly distributed data points from a Gaussian correlated stochastic surface); however, when data are acquired non-homogeneously (e.g., along transects) all of them show over/under-smoothing of the interpolated surface depending on local point density. As a result, actual information is lost in high point density areas (caused by over-smoothing) or artifacts appear around uneven density areas (“pimple” or “transect” effects). In this paper, we introduce a simple but robust multigrid/multiresolution interpolation (MMI) method which adapts to the spatial resolution available, being an exact interpolator where data exist and a smoothing generalizer where data are missing, but always fulfilling the statistical requirement that surface height mathematical expectation at the proper working resolution equals the mean height of the data at that same scale. The MMI is efficient enough to use K-fold cross-validation to estimate local errors. We also introduce a fractal extrapolation that simulates the elevation in data-depleted areas (rendering a visually realistic surface and also realistic error estimations). In this work, MMI is applied to reconstruct a real DEM, thus testing its accuracy and local error estimation capabilities under different sampling strategies (random points and transects). It is also applied to compute the bathymetry of the Gulf of San Jorge (Argentina) from multisource data of different origins and sampling qualities. The results show visually realistic surfaces with estimated local validation errors that are within the bounds of direct DEM comparison, in the case of the simulation, and within 10% of the typical deviation of the bathymetric surface in the real calculation.
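
The MMI algorithm itself is more elaborate than this, but the coarse-to-fine principle can be sketched in 1D: cells that contain observations take the local data mean, and empty cells inherit the value of their parent cell from the next coarser level. The grid depth and test data below are illustrative assumptions.

```python
import numpy as np

def multigrid_interp_1d(x, y, n_levels=6):
    """Coarse-to-fine gridded interpolation on [0, 1): cells containing data take
    the local data mean; empty cells inherit from the parent cell one level
    coarser. A toy illustration of the multigrid idea, not the MMI algorithm."""
    grid = np.array([y.mean()])                    # coarsest level: global mean
    for level in range(1, n_levels + 1):
        n_cells = 2 ** level
        new_grid = np.repeat(grid, 2)              # children inherit parent values
        cell = np.minimum((x * n_cells).astype(int), n_cells - 1)
        for c in np.unique(cell):                  # overwrite cells that contain data
            new_grid[c] = y[cell == c].mean()
        grid = new_grid
    return grid                                    # finest-level cell values

rng = np.random.default_rng(0)
x = np.sort(rng.random(40))                        # irregular sample locations
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(40)
print(multigrid_interp_1d(x, y)[:8])
```
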
5. Evangelista, Ivan Roy S., Lenmar T. Catajay, Maria Gemel B. Palconit, et al. "Detection of Japanese Quails (Coturnix japonica) in Poultry Farms Using YOLOv5 and Detectron2 Faster R-CNN." Journal of Advanced Computational Intelligence and Intelligent Informatics 26, no. 6 (2022): 930–36. http://dx.doi.org/10.20965/jaciii.2022.p0930.

Abstract:
Poultry, such as quails, is sensitive to stressful environments. Too much stress can adversely affect birds’ health, causing meat quality, egg production, and reproduction to degrade. Posture and behavioral activities can be indicators of poultry wellness and health condition. Animal welfare is one of the aims of precision livestock farming. Computer vision, with its real-time, non-invasive, and accurate monitoring capability, and its ability to obtain a myriad of information, is best for livestock monitoring. This paper introduces a quail detection mechanism based on computer vision and deep learning using YOLOv5 and Detectron2 (Faster R-CNN) models. An RGB camera installed 3 ft above the quail cages was used for video recording. Annotation was done in the MATLAB Video Labeler using its temporal interpolation algorithm. A total of 898 ground-truth images were extracted from the annotated videos. Image augmentation (changing orientation, adding noise, and manipulating hue, saturation, and brightness) was performed in Roboflow. Training, validation, and testing of the models were done in Google Colab. YOLOv5 and Detectron2 reached average precision (AP) values of 85.07 and 67.15, respectively. Both models performed satisfactorily in detecting quails in different backgrounds and lighting conditions.
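
The interpolation step in this pipeline is the temporal interpolation of bounding-box annotations between hand-labeled keyframes. As a generic illustration (not the MATLAB Video Labeler implementation), linear interpolation of the box coordinates frame by frame looks like this:

```python
import numpy as np

def interpolate_boxes(frame_a, box_a, frame_b, box_b):
    """Linearly interpolate an [x, y, w, h] bounding box between two hand-labeled
    keyframes; returns one box per intermediate frame (generic illustration)."""
    frames = np.arange(frame_a + 1, frame_b)
    t = (frames - frame_a) / (frame_b - frame_a)
    boxes = (1.0 - t)[:, None] * np.asarray(box_a, float) + t[:, None] * np.asarray(box_b, float)
    return dict(zip(frames.tolist(), boxes))

# Keyframes 10 and 20 were labeled by hand; frames 11..19 are filled automatically.
filled = interpolate_boxes(10, [50, 60, 32, 24], 20, [70, 58, 34, 26])
print(filled[15])   # the box halfway between the two annotations
```
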
6. Rodrigues, Daniel, Carl Duchesne, and Julien Lauzon-Gauthier. "Interpolation of Pathway Based Non-Destructive Testing (NDT) Data for Defect Detection and Localization in Pre-Baked Carbon Anodes." Metals 12, no. 9 (2022): 1411. http://dx.doi.org/10.3390/met12091411.

Abstract:
Producing consistent quality pre-baked carbon anodes for the Hall–Héroult aluminum reduction process is challenging due to the decreasing quality and increasing variability of anode raw materials. Non-destructive testing (NDT) techniques have been developed and recently implemented in manufacturing plants to establish better suited and more efficient quality control schemes than core sampling and characterization. These technologies collect measurements representing effective properties of the materials located along a pathway between two transducers (emitter and receiver), rather than a spatially resolved distribution of properties within the anode volume. A method to interpolate pathway-based measurements and provide a spatially resolved distribution of properties is proposed in this work to help NDT technologies achieve their full potential. The interpolation method is tested by simulating acousto-ultrasonic data collected from a large number of 2D and 3D toy examples representing simplified anode internal structures involving randomly generated defects. Experimental validation was performed by characterizing core samples extracted from a set of industrial anodes and correlating their properties with the speed of sound interpolated by the algorithm. The method is shown to be successful in determining the defect positions, and the interpolated results are shown to correlate significantly with mechanical properties.
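
The paper's interpolation method is not reproduced here, but the general problem, turning path-averaged measurements between transducer pairs into a per-cell map, is often attacked with row-action solvers such as Kaczmarz/ART; the sketch below uses an invented four-cell toy layout to localize a high-slowness "defect".

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=200, relax=0.5):
    """Row-action (Kaczmarz/ART) solver for A @ x ~= b: one common way to turn
    path-integrated measurements into a per-cell map (illustration only)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            nrm2 = a_i @ a_i
            if nrm2 > 0.0:
                x += relax * (b_i - a_i @ x) / nrm2 * a_i
    return x

# Invented toy layout: 4 cells in a row, 3 transducer paths with unit path lengths.
A = np.array([[1.0, 1.0, 0.0, 0.0],   # path 1 crosses cells 0 and 1
              [0.0, 1.0, 1.0, 0.0],   # path 2 crosses cells 1 and 2
              [0.0, 0.0, 1.0, 1.0]])  # path 3 crosses cells 2 and 3
true_slowness = np.array([0.25, 0.25, 0.40, 0.25])   # cell 2 hides a slow "defect"
travel_times = A @ true_slowness
# Underdetermined system: the reconstruction will not match the truth exactly,
# but it still flags cell 2 as the anomalous (slowest) cell.
print(kaczmarz(A, travel_times).round(3))
```
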
7. Kim, Seungjun, Junghoon Jin, and Jongsun Kim. "A Cost-Effective and Compact All-Digital Dual-Loop Jitter Attenuator for Built-Off-Test Applications." Electronics 11, no. 21 (2022): 3630. http://dx.doi.org/10.3390/electronics11213630.

Abstract:
A compact and low-power all-digital CMOS dual-loop jitter attenuator (DJA) for low-cost built-off-test (BOT) applications such as parallel multi-DUT testing is presented. The proposed DJA adopts a new digital phase interpolator (PI)-based clock recovery (CR) loop with an adaptive decimation filter (ADF) function to remove the jitter and phase noise of the input clock, and generate a phase-aligned clean output clock. In addition, by adopting an all-digital multi-phase multiplying delay-locked loop (MDLL), eight low-jitter evenly spaced reference clocks that are required for the PI are generated. In the proposed DJA, both the MDLL and PI-based CR are first-order systems, and so this DJA has the advantage of high system stability. In addition, the proposed DJA has the benefit of a wide operating frequency range, unlike general PLL-based jitter attenuators that have a narrow frequency range and a jitter peaking problem. Implemented in a 40 nm 0.9 V CMOS process, the proposed DJA generates cleaned programmable output clock frequencies from 2.4 to 4.7 GHz. Furthermore, it achieves a peak-to-peak and RMS jitter attenuation of –25.6 dB and –32.6 dB, respectively, at 2.4 GHz. In addition, it occupies an active area of only 0.0257 mm2 and consumes a power of 7.41 mW at 2.4 GHz.
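
At the behavioral level, a digital phase interpolator blends two adjacent reference clock phases according to a weight code. The sketch below is a generic model of that blending with an assumed code width and phase count, not the circuit described in the paper.

```python
def pi_output_phase(code: int, n_bits: int = 5, n_refs: int = 8) -> float:
    """Behavioral model of an n_refs-phase digital phase interpolator: the code
    picks a pair of adjacent reference phases and linearly weights between them.
    Returns the output phase in degrees (generic illustration, assumed code width)."""
    steps = 2 ** n_bits                    # interpolation steps per phase pair
    octant, frac = divmod(code, steps)
    phase_a = (octant % n_refs) * (360.0 / n_refs)
    phase_b = ((octant + 1) % n_refs) * (360.0 / n_refs)
    if phase_b < phase_a:                  # unwrap across the 360-degree boundary
        phase_b += 360.0
    w = frac / steps
    return ((1.0 - w) * phase_a + w * phase_b) % 360.0

for code in (0, 16, 32, 255):              # 8 phases x 32 steps = 256 codes
    print(code, pi_output_phase(code))
```
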
8. DeGaetano, Arthur T., and Brian N. Belcher. "Spatial Interpolation of Daily Maximum and Minimum Air Temperature Based on Meteorological Model Analyses and Independent Observations." Journal of Applied Meteorology and Climatology 46, no. 11 (2007): 1981–92. http://dx.doi.org/10.1175/2007jamc1536.1.

Abstract:
Hourly meteorological forecast model initializations are used to guide the spatial interpolation of daily cooperative network station data in the northeastern United States. The hourly model data are transformed to daily maximum and minimum temperature values and interpolated to the station points after standardization to station elevation based on the model temperature lapse rate. The resulting bias (interpolation − observation) is computed and then interpolated back to the model grids, allowing daily adjustment of the temperature fields based on independent observations. These adjusted data can then be interpolated to the resolution of interest. For testing, the data are interpolated to stations that were withheld during the construction of the bias field. The use of the model initializations as a basis for interpolation improves upon the conventional interpolation of elevation-adjusted station data alone. When inverse-distance-weighted interpolation is used in conjunction with data from a 40-km-model grid, mean annual absolute errors averaged 5% smaller than those from interpolation of station data alone for maximum and minimum temperature, which is a significant decrease. Using data from a 20-km-model grid reduces mean absolute error during June by 10% for maximum temperature and 16% for minimum temperature. Adjustment for elevation based on the model temperature lapse rate improved the interpolation of maximum temperature, but had little effect on minimum temperature. Winter minimum temperature errors were related to snow depth, a feature that likely contributed to the relatively high autocorrelation exhibited by the daily errors.
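
The elevation-standardization step described here amounts to shifting the model temperature by a lapse rate times the elevation difference, and the bias field is then spread back over the grid by an interpolator such as IDW. A minimal sketch with assumed units (deg C, metres) and a default standard-atmosphere lapse rate (the paper derives the lapse rate from the model itself):

```python
import numpy as np

def adjust_to_station_elevation(t_model, z_model, z_station, lapse_rate=-0.0065):
    """Shift a model temperature (deg C) from model-grid elevation to station
    elevation (metres) using a lapse rate in deg C per metre. The default is the
    standard-atmosphere value; the paper derives the lapse rate from the model."""
    return t_model + lapse_rate * (z_station - z_model)

def idw(x_obs, y_obs, v_obs, x_tgt, y_tgt, power=2.0):
    """Inverse-distance-weighted interpolation, here used to spread the station
    biases (interpolation minus observation) back onto target grid points."""
    d = np.hypot(x_obs[:, None] - x_tgt[None, :], y_obs[:, None] - y_tgt[None, :])
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return (w * v_obs[:, None]).sum(axis=0) / w.sum(axis=0)

# Model cell at 600 m interpolated to a station at 400 m: the value warms by 1.3 deg C.
print(adjust_to_station_elevation(t_model=12.0, z_model=600.0, z_station=400.0))
# Spread three station biases onto one grid point (all coordinates invented).
bias = np.array([0.4, -0.2, 0.1])
print(idw(np.array([0.0, 5.0, 9.0]), np.array([0.0, 3.0, 7.0]), bias,
          np.array([4.0]), np.array([4.0])))
```
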
9. Wang, Fen-Jiao, Chang-Lin Mei, Zhi Zhang, and Qiu-Xia Xu. "Testing for Local Spatial Association Based on Geographically Weighted Interpolation of Geostatistical Data with Application to PM2.5 Concentration Analysis." Sustainability 14, no. 21 (2022): 14646. http://dx.doi.org/10.3390/su142114646.

Abstract:
Using local spatial statistics to explore local spatial association of geo-referenced data has attracted much attention. As is known, a local statistic is formulated at a particular sampling unit based on a prespecified proximity relationship and the observations in the neighborhood of this sampling unit. However, geostatistical data such as meteorological data and air pollution data are generally collected from meteorological or monitoring stations which are usually sparsely located or highly clustered over space. For such data, a local spatial statistic formulated at an isolated sampling point may be ineffective because of its distant neighbors, or the statistic is undefinable in the sub-regions where no observations are available, which limits the comprehensive exploration of local spatial association over the whole studied region. In order to overcome this predicament, a local-linear geographically weighted interpolation method is proposed in this paper to obtain the predictors of the underlying spatial process on a lattice spatial tessellation, on which a local spatial statistic can be well formulated at each interpolation point. Furthermore, the bootstrap test is suggested to identify the locations where local spatial association is significant using the interpolated-value-based local spatial statistics. Simulation with comparison to some existing interpolation and test methods is conducted to assess the performance of the proposed interpolation and the suggested test methods, and a case study based on PM2.5 concentration data in Guangdong province, China, is used to demonstrate their applicability. The results show that the proposed interpolation method performs accurately in retrieving an underlying spatial process and the bootstrap test with the interpolated-value-based local statistics is powerful in identifying local patterns of spatial association.
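
A local-linear geographically weighted predictor at a grid point reduces to a kernel-weighted least-squares fit of a plane to nearby observations, whose intercept is the interpolated value. The sketch below uses a Gaussian kernel and an arbitrary bandwidth; the paper's estimator and the bootstrap test are more involved.

```python
import numpy as np

def gw_local_linear(x_obs, y_obs, v_obs, x0, y0, bandwidth=1.0):
    """Predict the surface value at (x0, y0) by a Gaussian-kernel-weighted
    local-linear (plane) fit to the observations: a sketch of the idea behind
    local-linear geographically weighted interpolation."""
    d2 = (x_obs - x0) ** 2 + (y_obs - y0) ** 2
    w = np.exp(-0.5 * d2 / bandwidth**2)
    X = np.column_stack([np.ones_like(x_obs), x_obs - x0, y_obs - y0])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], v_obs * sw, rcond=None)
    return beta[0]     # intercept = fitted value at the prediction point

rng = np.random.default_rng(1)
x, y = rng.random(200) * 10, rng.random(200) * 10
v = np.sin(x) + 0.1 * y + 0.05 * rng.standard_normal(200)
print(gw_local_linear(x, y, v, 5.0, 5.0))   # compare with sin(5) + 0.5 ~ -0.459
```
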
10. Rimon, Y., E. R. Graber, and A. Furman. "Interpolation of Extensive Routine Water Pollution Monitoring Datasets: Methodology and Discussion of Implications for Aquifer Management." Hydrology and Earth System Sciences Discussions 10, no. 7 (2013): 9363–87. http://dx.doi.org/10.5194/hessd-10-9363-2013.

Abstract:
A large fraction of the fresh water available for human use is stored in groundwater aquifers. Since human activities such as mining, agriculture, industry and urbanization often result in incursion of various pollutants to groundwater, routine monitoring of water quality is an indispensable component of judicious aquifer management. Unfortunately, groundwater pollution monitoring is expensive and usually cannot cover an aquifer with the spatial resolution necessary for making adequate management decisions. Interpolation of monitoring data between points is thus an important tool for supplementing measured data. However, interpolating routine groundwater pollution data poses a special problem due to the nature of the observations. The data from a producing aquifer usually include many zero pollution concentration values from the clean parts of the aquifer but may span a wide range (up to a few orders of magnitude) of values in the polluted areas. This manuscript presents a methodology that can cope with such datasets and use them to produce maps that present the pollution plumes but also delineate the clean areas that are fit for production. A method for assessing the quality of the mapping in a way that suits the data's dynamic range of values is also presented. A local variant of inverse distance weighting is employed to interpolate the data. Inclusion zones around the interpolation points ensure that only relevant observations contribute to each interpolated concentration. Using inclusion zones improves the accuracy of the mapping but results in interpolation grid points which are not assigned a value. That inherent trade-off between interpolation accuracy and coverage is demonstrated using both circular and elliptical inclusion zones. Leave-one-out cross-testing is used to assess and compare the performance of the interpolations. The methodology is demonstrated using groundwater pollution monitoring data from the Coastal aquifer along the Israeli shoreline.
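
The combination described here, IDW restricted to an inclusion zone, unassigned grid points when the zone is empty, and leave-one-out cross-testing, can be sketched directly (circular zones only; the elliptical variant and the dynamic-range-aware quality measure are in the paper):

```python
import numpy as np

def local_idw(x_obs, y_obs, c_obs, x0, y0, radius=2.0, power=2.0):
    """IDW using only observations inside a circular inclusion zone of the given
    radius; returns np.nan when the zone contains no observation, so that point
    stays unmapped (the accuracy/coverage trade-off noted in the abstract)."""
    d = np.hypot(x_obs - x0, y_obs - y0)
    inside = d <= radius
    if not inside.any():
        return np.nan
    w = 1.0 / np.maximum(d[inside], 1e-9) ** power
    return float((w * c_obs[inside]).sum() / w.sum())

def leave_one_out_errors(x_obs, y_obs, c_obs, **kw):
    """Leave-one-out cross-testing: re-estimate each observation from the others."""
    n = len(c_obs)
    est = np.array([local_idw(np.delete(x_obs, i), np.delete(y_obs, i),
                              np.delete(c_obs, i), x_obs[i], y_obs[i], **kw)
                    for i in range(n)])
    return est - c_obs

rng = np.random.default_rng(2)
x, y = rng.random(100) * 10, rng.random(100) * 10
c = np.where(x > 6, 50.0 * np.exp(-(y - 5) ** 2), 0.0)   # one plume, clean elsewhere
err = leave_one_out_errors(x, y, c, radius=2.0)
print(np.nanmean(np.abs(err)))
```
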