Academic literature on the topic 'Iteratively reweighted least squares'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Iteratively reweighted least squares.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Iteratively reweighted least squares"

1

Chen, Colin. "Distributed iteratively reweighted least squares and applications." Statistics and Its Interface 6, no. 4 (2013): 585–93. http://dx.doi.org/10.4310/sii.2013.v6.n4.a15.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

O’Leary, Dianne P. "Robust Regression Computation Using Iteratively Reweighted Least Squares." SIAM Journal on Matrix Analysis and Applications 11, no. 3 (July 1990): 466–80. http://dx.doi.org/10.1137/0611032.

3

Ba, Demba, Behtash Babadi, Patrick L. Purdon, and Emery N. Brown. "Robust spectrotemporal decomposition by iteratively reweighted least squares." Proceedings of the National Academy of Sciences 111, no. 50 (December 2, 2014): E5336–E5345. http://dx.doi.org/10.1073/pnas.1320637111.

4

Dollinger, Michael B., and Robert G. Staudte. "Influence Functions of Iteratively Reweighted Least Squares Estimators." Journal of the American Statistical Association 86, no. 415 (September 1991): 709–16. http://dx.doi.org/10.1080/01621459.1991.10475099.

5

Daubechies, Ingrid, Ronald DeVore, Massimo Fornasier, and C. Sinan Güntürk. "Iteratively reweighted least squares minimization for sparse recovery." Communications on Pure and Applied Mathematics 63, no. 1 (January 2010): 1–38. http://dx.doi.org/10.1002/cpa.20303.

6

Sigl, Juliane. "Nonlinear residual minimization by iteratively reweighted least squares." Computational Optimization and Applications 64, no. 3 (February 2, 2016): 755–92. http://dx.doi.org/10.1007/s10589-016-9829-x.

7

Merli, Marcello, and Luciana Sciascia. "Iteratively reweighted least squares in crystal structure refinements." Acta Crystallographica Section A Foundations of Crystallography 67, no. 5 (July 20, 2011): 456–68. http://dx.doi.org/10.1107/s0108767311023622.

8

Guo, Jianfeng. "Analytical quality assessment of iteratively reweighted least-squares (IRLS) method." Boletim de Ciências Geodésicas 20, no. 1 (March 2014): 132–41. http://dx.doi.org/10.1590/s1982-21702014000100009.

Abstract:
The iteratively reweighted least-squares (IRLS) technique has been widely employed in geodetic and geophysical literature. The reliability measures are important diagnostic tools for inferring the strength of the model validation. An exact analytical method is adopted to obtain insights on how much iterative reweighting can affect the quality indicators. Theoretical analyses and numerical results show that, when the downweighting procedure is performed, (1) the precision, all kinds of dilution of precision (DOP) metrics and the minimal detectable bias (MDB) will become larger; (2) the variations of the bias-to-noise ratio (BNR) are involved, and (3) all these results coincide with those obtained by the first-order approximation method.
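The downweighting procedure analyzed in this abstract is the core of robust IRLS estimation. As a generic illustration (not Guo's geodetic model; the Huber weight function and the tuning constant c = 1.345 are standard textbook choices assumed here), a minimal sketch:

```python
import numpy as np

def huber_irls(A, y, c=1.345, n_iter=50):
    """Robust regression via IRLS: residuals beyond c * scale are downweighted."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]                      # ordinary LS start
    for _ in range(n_iter):
        r = y - A @ x
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12  # robust MAD scale
        u = np.abs(r) / s
        w = np.minimum(1.0, c / np.maximum(u, 1e-12))             # Huber weights: w=1 inside, c/u outside
        AtW = A.T * w                                             # A^T W with W = diag(w)
        x = np.linalg.solve(AtW @ A, AtW @ y)
    return x
```

With a gross outlier in the data, the downweighted fit stays close to the clean-data trend while ordinary least squares is pulled away, which is the precision/reliability trade-off the abstract quantifies.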
9

Zhang, Zhi-Min, Shan Chen, and Yi-Zeng Liang. "Baseline correction using adaptive iteratively reweighted penalized least squares." Analyst 135, no. 5 (2010): 1138. http://dx.doi.org/10.1039/b922045c.

10

Pires, R. C., A. Simoes Costa, and L. Mili. "Iteratively reweighted least-squares state estimation through Givens Rotations." IEEE Transactions on Power Systems 14, no. 4 (1999): 1499–507. http://dx.doi.org/10.1109/59.801941.


Dissertations / Theses on the topic "Iteratively reweighted least squares"

1

Popov, Dmitriy. "Iteratively reweighted least squares minimization with prior information: a new approach." Master's thesis, University of Central Florida, 2011. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4822.

Abstract:
Iteratively reweighted least squares (IRLS) algorithms provide an alternative to the more standard ℓ1-minimization approach in compressive sensing. Daubechies et al. introduced a particularly stable version of an IRLS algorithm and rigorously proved its convergence in 2010. They did not, however, consider the case in which prior information on the support of the sparse domain of the solution is available. In 2009, Miosso et al. proposed an IRLS algorithm that makes use of this information to further reduce the number of measurements required to recover the solution with specified accuracy. Although Miosso et al. obtained a number of simulation results strongly confirming the utility of their approach, they did not rigorously establish the convergence properties of their algorithm. In this paper, we introduce prior information on the support of the sparse domain of the solution into the algorithm of Daubechies et al. We then provide a rigorous proof of the convergence of the resulting algorithm.
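For readers new to the topic, the Daubechies et al. iteration that this thesis builds on can be sketched in a few lines. This is a simplified sketch: the epsilon decay schedule below is an assumption, not the rule from the paper or the thesis, and no prior support information is used.

```python
import numpy as np

def irls_sparse(A, b, p=1.0, n_iter=100, eps=1e-2):
    """IRLS sketch for sparse recovery: approximately min ||x||_p subject to Ax = b."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # minimum-norm starting point
    for _ in range(n_iter):
        d = (x**2 + eps**2) ** (1.0 - p / 2.0)        # smoothed weights, d_i ~ |x_i|^(2-p)
        ADA = (A * d) @ A.T                           # A diag(d) A^T
        x = d * (A.T @ np.linalg.solve(ADA, b))       # weighted minimum-norm solution
        eps = max(eps * 0.9, 1e-4)                    # simple smoothing decay (assumed schedule)
    return x
```

Each step is a weighted least-norm solve, so the constraint Ax = b holds throughout while the weights progressively concentrate the solution on a sparse support.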
2

Sigl, Juliane. "Iteratively Reweighted Least Squares - Nonlinear Regression and Low-Dimensional Structure Learning for Big Data." Doctoral thesis (supervisor: Massimo Fornasier; reviewers: Rachel Ward, Sergei Pereverzyev, Massimo Fornasier), Technische Universität München. München: Universitätsbibliothek der TU München, 2018. http://d-nb.info/1160034850/34.

3

Palkki, Ryan D. "Chemical identification under a poisson model for Raman spectroscopy." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/45935.

Abstract:
Raman spectroscopy provides a powerful means of chemical identification in a variety of fields, partly because of its non-contact nature and the speed at which measurements can be taken. The development of powerful, inexpensive lasers and sensitive charge-coupled device (CCD) detectors has led to widespread use of commercial and scientific Raman systems. However, relatively little work has been done developing physics-based probabilistic models for Raman measurement systems and crafting inference algorithms within the framework of statistical estimation and detection theory. The objective of this thesis is to develop algorithms and performance bounds for the identification of chemicals from their Raman spectra. First, a Poisson measurement model based on the physics of a dispersive Raman device is presented. The problem is then expressed as one of deterministic parameter estimation, and several methods are analyzed for computing the maximum-likelihood (ML) estimates of the mixing coefficients under our data model. The performance of these algorithms is compared against the Cramer-Rao lower bound (CRLB). Next, the Raman detection problem is formulated as one of multiple hypothesis detection (MHD), and an approximation to the optimal decision rule is presented. The resulting approximations are related to the minimum description length (MDL) approach to inference. In our simulations, this method is seen to outperform two common general detection approaches, the spectral unmixing approach and the generalized likelihood ratio test (GLRT). The MHD framework is applied naturally to both the detection of individual target chemicals and to the detection of chemicals from a given class. The common, yet vexing, scenario is then considered in which chemicals are present that are not in the known reference library. A novel variation of nonnegative matrix factorization (NMF) is developed to address this problem. 
Our simulations indicate that this algorithm gives better estimation performance than the standard two-stage NMF approach and the fully supervised approach when there are chemicals present that are not in the library. Finally, estimation algorithms are developed that take into account errors that may be present in the reference library. In particular, an algorithm is presented for ML estimation under a Poisson errors-in-variables (EIV) model. It is shown that this same basic approach can also be applied to the nonnegative total least squares (NNTLS) problem. Most of the techniques developed in this thesis are applicable to other problems in which an object is to be identified by comparing some measurement of it to a library of known constituent signatures.
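Maximum-likelihood estimation under a Poisson model, as in this thesis, is commonly computed by Fisher scoring, which is itself an IRLS iteration. A minimal sketch for a generic Poisson log-linear model follows (this is a standard GLM illustration, not the thesis's Raman measurement model or its nonnegativity-constrained estimators):

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Fisher scoring (IRLS) for a Poisson log-linear model: y_i ~ Poisson(exp(x_i' beta))."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)             # model mean (= variance for the Poisson family)
        z = X @ beta + (y - mu) / mu      # working response
        XtW = X.T * mu                    # IRLS weights W = diag(mu)
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta
```

At convergence the score equations X'(y - mu) = 0 are satisfied, i.e. the iteration has found the ML estimate.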
4

Guo, Mengmeng. "Generalized quantile regression." Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2012. http://dx.doi.org/10.18452/16569.

Abstract:
Generalized quantile regressions, including the conditional quantiles and expectiles as special cases, are useful alternatives to the conditional means for characterizing a conditional distribution, especially when the interest lies in the tails. We denote by $v_n(x)$ the kernel smoothing estimator of the expectile curves. We prove the strong uniform consistency rate of $v_n(x)$ under general conditions. Moreover, using strong approximations of the empirical process and extreme value theory, we consider the asymptotic maximal deviation $\sup_{0 \leqslant x \leqslant 1}|v_n(x)-v(x)|$. According to the asymptotic theory, we construct simultaneous confidence bands around the estimated expectile function. We develop a functional data analysis approach to jointly estimate a family of generalized quantile regressions. Our approach assumes that the generalized quantiles share some common features that can be summarized by a small number of principal component functions. The principal components are modeled as spline functions and are estimated by minimizing a penalized asymmetric loss measure. An iteratively reweighted least squares algorithm is developed for computation. While separate estimation of individual generalized quantile regressions usually suffers from large variability due to lack of sufficient data, by borrowing strength across data sets, our joint estimation approach significantly improves the estimation efficiency, which is demonstrated in a simulation study. The proposed method is applied to data from 150 weather stations in China to obtain the generalized quantile curves of the volatility of the temperature at these stations.
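The iteratively reweighted least squares computation mentioned in this abstract reduces, in the simplest case, to alternating between asymmetric weights and a weighted least squares solve. A minimal sketch for plain linear expectile regression (without the spline and principal-component machinery of the thesis):

```python
import numpy as np

def expectile_irls(X, y, tau=0.5, n_iter=100):
    """IRLS for linear expectile regression (asymmetric least squares at level tau)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]       # OLS starting point
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1.0 - tau)          # asymmetric squared-error weights
        XtW = X.T * w
        beta = np.linalg.solve(XtW @ X, XtW @ y)
    return beta
```

At tau = 0.5 all weights are equal and the iteration reproduces ordinary least squares; for tau near 1 the fit tracks the upper tail of the conditional distribution.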
5

Barreto, Jose Antonio. "L(p)-approximation by the iteratively reweighted least squares method and the design of digital FIR filters in one dimension." Thesis, 1993. http://hdl.handle.net/1911/13689.

Abstract:
In this thesis a new and simple-to-program approach is proposed to obtain an $L_p$ approximation, $2 < p < \infty$, based on the Iteratively Reweighted Least Squares (IRLS) method, for designing a linear-phase digital finite impulse response (FIR) filter. This technique, interesting in its own right, can also be used as an intermediate design method between the least squared error and the minimum Chebyshev error criteria. Various IRLS algorithms are evaluated through comparison of the number of iterations required for convergence. It is shown that Kahng's (or Fletcher et al.'s) method with a modified acceleration technique developed in this work performs better, for most practical cases, than the other algorithms examined. A filter design method which allows different norms in different bands is proposed and implemented. An important extension of this method also considers the case of different p's (or different norms) in the stopband.
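The Kahng-style iteration the thesis evaluates can be sketched for a generic overdetermined $L_p$ approximation problem. This is a basic sketch only: the partial-update factor 1/(p-1) is Kahng's classical scheme, and the modified acceleration developed in the thesis is not reproduced here.

```python
import numpy as np

def lp_approx_irls(A, b, p=4.0, n_iter=100, delta=1e-8):
    """IRLS sketch for min ||Ax - b||_p with 2 < p < inf (Kahng-style partial update)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # L2 solution as starting point
    for _ in range(n_iter):
        w = np.abs(A @ x - b) ** (p - 2) + delta      # residual-based weights |r_i|^(p-2)
        AtW = A.T * w
        x_new = np.linalg.solve(AtW @ A, AtW @ b)
        x = (x_new + (p - 2) * x) / (p - 1)           # partial update for stability
    return x
```

Without the partial update, the raw reweighted iteration tends to oscillate for larger p; the damped combination is what makes the scheme usable as p grows toward the Chebyshev limit.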
6

Masák, Tomáš. "Velká data - extrakce klíčových informací pomocí metod matematické statistiky a strojového učení" [Big Data: Extracting Key Information Using Methods of Mathematical Statistics and Machine Learning]. Master's thesis, 2017. http://www.nusl.cz/ntk/nusl-357228.

Abstract:
This thesis is concerned with data analysis, especially with principal component analysis and its sparse modification (SPCA), which is NP-hard to solve. The SPCA problem can be recast into the regression framework, in which sparsity is usually induced with an ℓ1-penalty. In the thesis, we propose to use an iteratively reweighted ℓ2-penalty instead of the aforementioned ℓ1-approach. We compare the resulting algorithm with several well-known approaches to SPCA using both a simulation study and an interesting practical example in which we analyze voting records of the Parliament of the Czech Republic. We show experimentally that the proposed algorithm outperforms the other considered algorithms. We also prove convergence of both the proposed algorithm and the original regression-based approach to PCA.
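The reweighted ℓ2-penalty idea proposed in this thesis can be illustrated on a plain sparse regression toy. This is not the SPCA algorithm itself; the adaptive-ridge weights 1/(|β_j| + ε) are a standard way to mimic an ℓ1 penalty with iterated ridge solves.

```python
import numpy as np

def reweighted_l2(X, y, lam=1.0, n_iter=50, eps=1e-6):
    """Adaptive-ridge sketch: an l1-like penalty via iteratively reweighted l2 (ridge) solves."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]       # OLS starting point
    for _ in range(n_iter):
        w = 1.0 / (np.abs(beta) + eps)                # penalty weights grow as beta_j -> 0
        beta = np.linalg.solve(X.T @ X + lam * np.diag(w), X.T @ y)
    return beta
```

Coefficients near zero receive ever larger ridge penalties and are driven to (numerical) zero, while large coefficients are barely shrunk, giving a sparse solution from quadratic subproblems only.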
7

"Synthetic Aperture Radar Image Formation Via Sparse Decomposition." Master's thesis, 2011. http://hdl.handle.net/2286/R.I.9211.

Abstract:
Spotlight mode synthetic aperture radar (SAR) imaging involves a tomographic reconstruction from projections, necessitating acquisition of large amounts of data in order to form a moderately sized image. Since typical SAR sensors are hosted on mobile platforms, it is common to have limitations on SAR data acquisition, storage and communication that can lead to data corruption and a resulting degradation of image quality. It is convenient to consider corrupted samples as missing, creating a sparsely sampled aperture. A sparse aperture would also result from compressive sensing, which is a very attractive concept for data intensive sensors such as SAR. Recent developments in sparse decomposition algorithms can be applied to the problem of SAR image formation from a sparsely sampled aperture. Two modified sparse decomposition algorithms are developed, based on well known existing algorithms, modified to be practical in application on modest computational resources. The two algorithms are demonstrated on real-world SAR images. Algorithm performance with respect to super-resolution, noise, coherent speckle and target/clutter decomposition is explored. These algorithms yield more accurate image reconstruction from sparsely sampled apertures than classical spectral estimators. At the current state of development, sparse image reconstruction using these two algorithms requires about two orders of magnitude greater processing time than classical SAR image formation.

Book chapters on the topic "Iteratively reweighted least squares"

1

Samejima, Masaki, and Yasuyuki Matsushita. "Fast General Norm Approximation via Iteratively Reweighted Least Squares." In Computer Vision – ACCV 2016 Workshops, 207–21. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-54427-4_16.

2

Marx, Brian D. "Iterative Reweighted Partial Least Squares Estimation for GLMs." In Statistical Modelling, 169–76. New York, NY: Springer New York, 1995. http://dx.doi.org/10.1007/978-1-4612-0789-4_21.

3

Knight, Keith. "A Continuous-Time Iteratively Reweighted Least Squares Algorithm for $L_\infty$ Estimation." In Springer Proceedings in Mathematics & Statistics, 59–68. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-28665-1_4.

4

Dollinger, Michael B., and Robert G. Staudte. "Efficiency of Reweighted Least Squares Iterates." In Directions in Robust Statistics and Diagnostics, 61–65. New York, NY: Springer New York, 1991. http://dx.doi.org/10.1007/978-1-4615-6861-2_6.

5

Mukundan, Arun, Giorgos Tolias, and Ondřej Chum. "Robust Data Whitening as an Iteratively Re-weighted Least Squares Problem." In Image Analysis, 234–47. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-59126-1_20.

6

"Iteratively Reweighted Least Squares." In Probability Methods for Cost Uncertainty Analysis, 477–83. Chapman and Hall/CRC, 2015. http://dx.doi.org/10.1201/b19143-23.

7

McCullagh, Peter. "WHAT CAN GO WRONG WITH ITERATIVELY RE-WEIGHTED LEAST SQUARES?" In Multilevel Analysis of Educational Data, 147–57. Elsevier, 1989. http://dx.doi.org/10.1016/b978-0-12-108840-8.50013-5.

8

Gill, Jeff, and Kenneth J. Meier. "The Theory and Application of Generalized Substantively Reweighted Least Squares." In What Works, 41–58. Routledge, 2018. http://dx.doi.org/10.4324/9780429503108-3.


Conference papers on the topic "Iteratively reweighted least squares"

1

Li, Shuang, Qiuwei Li, Gang Li, Xiongxiong He, and Liping Chang. "Iteratively reweighted least squares for block-sparse recovery." In 2014 IEEE 9th Conference on Industrial Electronics and Applications (ICIEA). IEEE, 2014. http://dx.doi.org/10.1109/iciea.2014.6931321.

2

Liu, Kaihui, Liangtian Wan, and Feiyu Wang. "Fast Iteratively Reweighted Least Squares Minimization for Sparse Recovery." In 2018 IEEE 23rd International Conference on Digital Signal Processing (DSP). IEEE, 2018. http://dx.doi.org/10.1109/icdsp.2018.8631827.

3

Ince, Taner, Nurdal Watsuji, and Arif Nacaroglu. "Iteratively reweighted least squares minimization for sparsely corrupted measurements." In 2011 IEEE 19th Signal Processing and Communications Applications Conference (SIU). IEEE, 2011. http://dx.doi.org/10.1109/siu.2011.5929657.

4

Shuang, Li. "Sparse Representation of Hardy Function by Iteratively Reweighted Least Squares." In 2020 International Symposium on Computer Engineering and Intelligent Communications (ISCEIC). IEEE, 2020. http://dx.doi.org/10.1109/isceic51027.2020.00020.

5

Chen, Chen, Junzhou Huang, Lei He, and Hongsheng Li. "Preconditioning for Accelerated Iteratively Reweighted Least Squares in Structured Sparsity Reconstruction." In 2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2014. http://dx.doi.org/10.1109/cvpr.2014.353.

6

Kummerle, Christian, and Juliane Sigl. "Harmonic Mean Iteratively Reweighted Least Squares for low-rank matrix recovery." In 2017 International Conference on Sampling Theory and Applications (SampTA). IEEE, 2017. http://dx.doi.org/10.1109/sampta.2017.8024466.

7

Kummerle, Christian, and Claudio M. Verdun. "Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares." In 2019 13th International conference on Sampling Theory and Applications (SampTA). IEEE, 2019. http://dx.doi.org/10.1109/sampta45681.2019.9030959.

8

Park, Young Woong, and Diego Klabjan. "Iteratively Reweighted Least Squares Algorithms for L1-Norm Principal Component Analysis." In 2016 IEEE 16th International Conference on Data Mining (ICDM). IEEE, 2016. http://dx.doi.org/10.1109/icdm.2016.0054.

9

Kim, Hongman, Melih Papila, Raphael Haftka, William Mason, Layne Watson, and Bernard Grossman. "Detection and correction of poorly converged optimizations by Iteratively Reweighted Least Squares." In 41st Structures, Structural Dynamics, and Materials Conference and Exhibit. Reston, Virginia: American Institute of Aeronautics and Astronautics, 2000. http://dx.doi.org/10.2514/6.2000-1525.

10

Zhou, Xu, Rafael Molina, Fugen Zhou, and Aggelos K. Katsaggelos. "Fast iteratively reweighted least squares for lp regularized image deconvolution and reconstruction." In 2014 IEEE International Conference on Image Processing (ICIP). IEEE, 2014. http://dx.doi.org/10.1109/icip.2014.7025357.


Reports on the topic "Iteratively reweighted least squares"

1

Wohlberg, Brendt, and Paul Rodriguez. Sparse Representations with Data Fidelity Term via an Iteratively Reweighted Least Squares Algorithm. Office of Scientific and Technical Information (OSTI), January 2007. http://dx.doi.org/10.2172/1000493.

2

Daubechies, Ingrid, Ronald DeVore, Massimo Fornasier, and C. S. Gunturk. Iteratively Re-weighted Least Squares Minimization for Sparse Recovery. Fort Belvoir, VA: Defense Technical Information Center, June 2008. http://dx.doi.org/10.21236/ada528510.
