Academic literature on the topic 'Gradient boosting'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Gradient boosting.'
Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Gradient boosting"
Friedman, Jerome H. "Stochastic gradient boosting." Computational Statistics & Data Analysis 38, no. 4 (February 2002): 367–78. http://dx.doi.org/10.1016/s0167-9473(01)00065-2.
Biau, G., B. Cadre, and L. Rouvière. "Accelerated gradient boosting." Machine Learning 108, no. 6 (February 4, 2019): 971–92. http://dx.doi.org/10.1007/s10994-019-05787-1.
Dombry, Clément, and Jean-Jil Duchamps. "Infinitesimal gradient boosting." Stochastic Processes and their Applications 170 (April 2024): 104310. http://dx.doi.org/10.1016/j.spa.2024.104310.
Dahlia, Rizka, and Cucu Ika Agustyaningrum. "Perbandingan Gradient Boosting dan Light Gradient Boosting Dalam Melakukan Klasifikasi Rumah Sewa." Jurnal Nasional Komputasi dan Teknologi Informasi (JNKTI) 5, no. 6 (December 29, 2022): 1016–20. http://dx.doi.org/10.32672/jnkti.v5i6.5460.
Dubossarsky, E., J. H. Friedman, J. T. Ormerod, and M. P. Wand. "Wavelet-based gradient boosting." Statistics and Computing 26, no. 1-2 (May 8, 2014): 93–105. http://dx.doi.org/10.1007/s11222-014-9474-0.
Lu, Haihao, and Rahul Mazumder. "Randomized Gradient Boosting Machine." SIAM Journal on Optimization 30, no. 4 (January 2020): 2780–808. http://dx.doi.org/10.1137/18m1223277.
Suryana, Silvia Elsa, Budi Warsito, and Suparti Suparti. "PENERAPAN GRADIENT BOOSTING DENGAN HYPEROPT UNTUK MEMPREDIKSI KEBERHASILAN TELEMARKETING BANK." Jurnal Gaussian 10, no. 4 (December 31, 2021): 617–23. http://dx.doi.org/10.14710/j.gauss.v10i4.31335.
Zhu, Fei, Xiangping Wu, Yijun Lu, and Jiandong Huang. "Strength Estimation and Feature Interaction of Carbon Nanotubes-Modified Concrete Using Artificial Intelligence-Based Boosting Ensembles." Buildings 14, no. 1 (January 4, 2024): 134. http://dx.doi.org/10.3390/buildings14010134.
ЧЕВЕРЕВА, С. А., and Е. А. КАЗАНКОВ. "COMPARATIVE ANALYSIS OF ALGORITHMS BASED ON GRADIENT BOOSTING IN THE FIELD OF SECURITY OF THE INTERNET OF THINGS." Экономика и предпринимательство, no. 7(168) (August 6, 2024): 836–39. http://dx.doi.org/10.34925/eip.2024.168.7.164.
Kriuchkova, Anastasiia, Varvara Toloknova, and Svitlana Drin. "Predictive model for a product without history using LightGBM. Pricing model for a new product." Mohyla Mathematical Journal 6 (April 18, 2024): 6–13. http://dx.doi.org/10.18523/2617-7080620236-13.
Full textDissertations / Theses on the topic "Gradient boosting"
Werner, Tino [Verfasser], Peter [Akademischer Betreuer] Ruckdeschel, and Matthias [Akademischer Betreuer] Schmid. "Gradient-Free Gradient Boosting / Tino Werner ; Peter Ruckdeschel, Matthias Schmid." Oldenburg : BIS der Universität Oldenburg, 2020. http://d-nb.info/120419968X/34.
Moreni, Matilde <1994>. "Prediction of Cryptocurrency prices using Gradient Boosting machine." Master's Degree Thesis, Università Ca' Foscari Venezia, 2020. http://hdl.handle.net/10579/17739.
Mayr, Andreas [Verfasser]. "Boosting beyond the mean - extending component-wise gradient boosting algorithms to multiple dimensions / Andreas Mayr." München : Verlag Dr. Hut, 2013. http://d-nb.info/104287848X/34.
Ahlgren, Marcus. "Claims Reserving using Gradient Boosting and Generalized Linear Models." Thesis, KTH, Matematisk statistik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229406.
Full textEn av de centrala verksamheterna ett försäkringsbolag arbetar med handlar om att uppskatta skadekostnader för att kunna ersätta försäkringstagarna. Denna procedur kallas reservsättning och utförs av aktuarier med hjälp av statistiska metoder. Under de senaste årtiondena har statistiska inlärningsmetoder blivit mer och mer populära tack vare deras förmåga att hitta komplexa mönster i alla typer av data. Dock har intresset för dessa varit relativt lågt inom försäkringsbranschen till förmån för mer traditionella försäkringsmatematiska metoder. I den här masteruppsatsen undersöker vi förmågan att reservsätta med metoden \textit{gradient boosting}, en icke-parametrisk statistisk inlärningsmetod som har visat sig fungera mycket väl inom en rad andra områden vilket har gjort metoden mycket populär. Vi jämför denna metod med generaliserade linjära modeller(GLM) som är en av de vanliga metoderna vid reservsättning. Vi jämför modellerna med hjälp av ett dataset tillhandahålls av Länsförsäkringar AB. Modellerna implementerades med R. 80\% av detta dataset används för att träna modellerna och resterande 20\% används för att evaluera modellernas prediktionsförmåga på okänd data. Resultaten visar att GLM har ett lägre prediktionsfel. Gradient boosting kräver att ett antal hyperparametrar justeras manuellt för att få en välfungerande modell medan GLM inte kräver lika mycket korrigeringar varför den är mer praktiskt lämpad. Fördelen med att kunna modellerna komplexa förhållanden i data utnyttjas inte till fullo i denna uppsats då vi endast arbetar med sex prediktionsvariabler. Det är sannolikt att gradient boosting skulle ge bättre resultat med mer komplicerade datastrukturer.
Kriegler, Brian. "Cost-sensitive stochastic gradient boosting within a quantitative regression framework." Diss., Restricted to subscribing institutions, 2007. http://proquest.umi.com/pqdweb?did=1383476771&sid=1&Fmt=2&clientId=1564&RQT=309&VName=PQD.
Sabo, Juraj. "Gradient Boosting Machine and Artificial Neural Networks in R and H2O." Master's thesis, Vysoká škola ekonomická v Praze, 2016. http://www.nusl.cz/ntk/nusl-264614.
Söderholm, Matilda. "Predicting Risk of Delays in Postal Deliveries with Neural Networks and Gradient Boosting Machines." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-169478.
Nikolaou, Nikolaos. "Cost-sensitive boosting : a unified approach." Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/costsensitive-boosting-a-unified-approach(ae9bb7bd-743e-40b8-b50f-eb59461d9d36).html.
Mayrink, Victor Teixeira de Melo. "Avaliação do algoritmo Gradient Boosting em aplicações de previsão de carga elétrica a curto prazo." Universidade Federal de Juiz de Fora (UFJF), 2016. https://repositorio.ufjf.br/jspui/handle/ufjf/3563.
Full textApproved for entry into archive by Adriana Oliveira (adriana.oliveira@ufjf.edu.br) on 2017-03-07T15:06:57Z (GMT) No. of bitstreams: 1 victorteixeirademelomayrink.pdf: 2587774 bytes, checksum: 1319cc37a15480796050b618b4d7e5f7 (MD5)
Made available in DSpace on 2017-03-07T15:06:57Z (GMT). No. of bitstreams: 1 victorteixeirademelomayrink.pdf: 2587774 bytes, checksum: 1319cc37a15480796050b618b4d7e5f7 (MD5) Previous issue date: 2016-08-31
FAPEMIG - Fundação de Amparo à Pesquisa do Estado de Minas Gerais
The storage of electrical energy is still not feasible on a large scale due to technical and economic constraints. Therefore, all energy to be consumed must be produced instantly; it is not possible to store surplus production, or to cover supply shortages with safety stocks, even for a short period of time. Thus, one of the main challenges of energy planning is computing accurate forecasts of future demand. In this work, we present a model for short-term load forecasting. The methodology consists of building a forecasting committee by applying the Gradient Boosting algorithm in combination with decision-tree models and the exponential smoothing technique. This strategy comprises a supervised learning method that fits the forecasting model on historical energy consumption data, recorded temperatures, and calendar variables. The proposed models were tested on two different datasets and showed very good performance compared with results published in other recent works.
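The two-component committee described in the abstract can be sketched as an exponential-smoothing baseline for the load level plus a gradient boosting model trained on calendar and temperature features to explain the residual. Everything below (the synthetic hourly series, the smoothing constant, the feature set) is an illustrative assumption, not the dissertation's implementation.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 24 * 60  # 60 days of hourly load
hour = np.arange(n) % 24
temp = 20 + 5 * np.sin(2 * np.pi * np.arange(n) / 24) + rng.normal(0, 1, n)
load = 100 + 10 * np.sin(2 * np.pi * hour / 24) + 0.5 * temp + rng.normal(0, 2, n)

# Component 1: simple exponential smoothing gives the load level.
alpha = 0.2
level = np.empty(n)
level[0] = load[0]
for t in range(1, n):
    level[t] = alpha * load[t - 1] + (1 - alpha) * level[t - 1]

# Component 2: gradient boosting learns the residual from
# calendar (hour of day) and temperature features.
X = np.column_stack([hour, temp])
resid = load - level
gbm = GradientBoostingRegressor(n_estimators=100, max_depth=3,
                                random_state=0).fit(X[:-24], resid[:-24])

# Day-ahead forecast: smoothed level plus the learned correction.
forecast = level[-24:] + gbm.predict(X[-24:])
```

The point of the hybrid is that smoothing tracks the slowly varying level while the trees capture the repeated calendar and weather patterns.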
Sjöblom, Niklas. "Evolutionary algorithms in statistical learning : Automating the optimization procedure." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-160118.
Scania has long worked with statistics but has in recent years invested in becoming a more data-driven company, and now uses data science in almost every department. The algorithms developed by data scientists must be optimized to be fully exploited, and this has traditionally been a manual and time-consuming process. This thesis investigates whether, and how well, evolutionary algorithms can be used to automate the optimization procedure. The evaluation was carried out by implementing and analysing four variants of genetic algorithms with different degrees of complexity and tuning parameters. The algorithm targeted for optimization was XGBoost, a gradient-boosted tree-based model, applied to data that had previously been modelled in a competition. The results show that evolutionary algorithms are applicable for finding good models, but also demonstrate how fundamental data preprocessing is before modelling.
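The idea in the abstract (a genetic algorithm searching the hyperparameter space of a boosted-tree model) can be sketched as below. sklearn's GradientBoostingClassifier stands in for XGBoost, and the GA is deliberately toy-sized (mutation only, no crossover); population size, generations, and gene ranges are all assumptions.

```python
import random
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
random.seed(0)

def fitness(genes):
    """Cross-validated accuracy of a boosted model with these hyperparameters."""
    depth, lr = genes
    model = GradientBoostingClassifier(n_estimators=50, max_depth=depth,
                                       learning_rate=lr, random_state=0)
    return cross_val_score(model, X, y, cv=3).mean()

def mutate(genes):
    """Perturb depth by +/-1 and scale the learning rate, within bounds."""
    depth, lr = genes
    return (max(1, depth + random.choice([-1, 0, 1])),
            min(1.0, max(0.01, lr * random.choice([0.5, 1.0, 2.0]))))

# Evolve: each generation keeps the fittest half and refills the
# population with mutated copies of the survivors.
population = [(random.randint(1, 5), random.uniform(0.01, 0.5)) for _ in range(6)]
for _ in range(3):
    survivors = sorted(population, key=fitness, reverse=True)[:3]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(3)]

best = max(population, key=fitness)
```

The appeal over grid search is that the GA spends evaluations near configurations that already scored well, which is what makes it a candidate for automating the otherwise manual tuning loop.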
Books on the topic "Gradient boosting"
Jome, Kaitlyn. Gradient Boosting Trees : a Beginner's Guide for Gradient Boosting: Decision Tree Machine Learning Projects. Independently Published, 2021.
Nylen, Jonas. Gradient Boosting Trees 101 : How Do Decision Trees Work: Boosting. Independently Published, 2021.
Verhagen, Efren. Gradient Boosting Trees 101 : How Do Decision Trees Work: Gradient Boosted Trees. Independently Published, 2021.
Xgboost. the Extreme Gradient Boosting for Mining Applications. GRIN Verlag GmbH, 2018.
Hands-On Gradient Boosting with XGBoost and Scikit-learn: Perform Accessible Machine Learning and Extreme Gradient Boosting with Python. Packt Publishing, Limited, 2020.
Bhave, Roshan. Practical Machine Learning with LightGBM and Python: Explore Microsofts Gradient Boosting Framework to Optimize Machine Learning. Packt Publishing, Limited, 2021.
Decision Making Handbook: Career Readiness, Bank Receptionist, Vector Concept, Crypto Wallets, Forrester Wave, TR6 Internet Entrepreneur, Bvnk, Financial Position, Gradient Boosting. Independently Published, 2022.
Find full textBook chapters on the topic "Gradient boosting"
Truong, Dothang. "Gradient Boosting." In Data Science and Machine Learning for Non-Programmers, 479–504. Boca Raton: Chapman and Hall/CRC, 2024. http://dx.doi.org/10.1201/9781003162872-18.
Ayyadevara, V. Kishore. "Gradient Boosting Machine." In Pro Machine Learning Algorithms, 117–34. Berkeley, CA: Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3564-5_6.
Greenwell, Brandon M. "Gradient boosting machines." In Tree-Based Methods for Statistical Learning in R, 309–58. Boca Raton: Chapman and Hall/CRC, 2022. http://dx.doi.org/10.1201/9781003089032-8.
Biau, Gérard, and Benoît Cadre. "Optimization by Gradient Boosting." In Advances in Contemporary Statistics and Econometrics, 23–44. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-73249-3_2.
Emami, Seyedsaman, Carlos Ruiz Pastor, and Gonzalo Martínez-Muñoz. "Multi-Task Gradient Boosting." In Lecture Notes in Computer Science, 97–107. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-40725-3_9.
Denuit, Michel, Donatien Hainaut, and Julien Trufin. "Gradient Boosting with Neural Networks." In Springer Actuarial, 167–92. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-25827-6_7.
Knieper, Lars, Thomas Kneib, and Elisabeth Bergherr. "Spatial Confounding in Gradient Boosting." In Contributions to Statistics, 88–94. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-65723-8_14.
Nziyumva, Eric, Rong Hu, Chih-Yu Hsu, and Jovial Niyogisubizo. "Electrical Load Forecasting Using Hybrid of Extreme Gradient Boosting and Light Gradient Boosting Machine." In Lecture Notes in Electrical Engineering, 1083–93. Singapore: Springer Singapore, 2022. http://dx.doi.org/10.1007/978-981-16-6963-7_95.
Wan, Lunjun, Ke Tang, and Rui Wang. "Gradient Boosting-Based Negative Correlation Learning." In Intelligent Data Engineering and Automated Learning – IDEAL 2013, 358–65. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-41278-3_44.
Tahir, Muhammad. "Brain MRI Classification Using Gradient Boosting." In Machine Learning in Clinical Neuroimaging and Radiogenomics in Neuro-oncology, 294–301. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-66843-3_29.
Full textConference papers on the topic "Gradient boosting"
Wang, Hongfei, Chenliang Luo, Deqing Zou, Hai Jin, and Wenjie Cai. "Gradient Boosting-Accelerated Evolution for Multiple-Fault Diagnosis." In 2024 Design, Automation & Test in Europe Conference & Exhibition (DATE), 1–6. IEEE, 2024. http://dx.doi.org/10.23919/date58400.2024.10546656.
Diego, Ferran, and Fred A. Hamprecht. "Structured Regression Gradient Boosting." In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2016. http://dx.doi.org/10.1109/cvpr.2016.162.
Cheng, Chen, Fen Xia, Tong Zhang, Irwin King, and Michael R. Lyu. "Gradient boosting factorization machines." In the 8th ACM Conference. New York, New York, USA: ACM Press, 2014. http://dx.doi.org/10.1145/2645710.2645730.
Timmons, Caitlin, Andrea Boskovic, Sreeharsha Lakamsani, Walter Gerych, Luke Buquicchio, and Elke Rundensteiner. "Positive Unlabeled Gradient Boosting." In 2020 IEEE MIT Undergraduate Research Technology Conference (URTC). IEEE, 2020. http://dx.doi.org/10.1109/urtc51696.2020.9668901.
Grari, Vincent, Boris Ruf, Sylvain Lamprier, and Marcin Detyniecki. "Fair Adversarial Gradient Tree Boosting." In 2019 IEEE International Conference on Data Mining (ICDM). IEEE, 2019. http://dx.doi.org/10.1109/icdm.2019.00124.
Enkhtaivan, Batnyam, and Isamu Teranishi. "pGBF: Personalized Gradient Boosting Forest." In 2023 International Joint Conference on Neural Networks (IJCNN). IEEE, 2023. http://dx.doi.org/10.1109/ijcnn54540.2023.10191289.
A, Ashwini, and Loganathan V. "Nutrigrow Using Gradient Boosting Regressor." In 2024 1st International Conference on Trends in Engineering Systems and Technologies (ICTEST). IEEE, 2024. http://dx.doi.org/10.1109/ictest60614.2024.10576157.
Ibragimov, Bulat, and Anton Vakhrushev. "Uplift Modelling via Gradient Boosting." In KDD '24: The 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 1177–87. New York, NY, USA: ACM, 2024. http://dx.doi.org/10.1145/3637528.3672019.
Liu, Yang, Zhuo Ma, Ximeng Liu, Siqi Ma, Surya Nepal, Robert H. Deng, and Kui Ren. "Boosting Privately: Federated Extreme Gradient Boosting for Mobile Crowdsensing." In 2020 IEEE 40th International Conference on Distributed Computing Systems (ICDCS). IEEE, 2020. http://dx.doi.org/10.1109/icdcs47774.2020.00017.
Ying, Bicheng, and Ali H. Sayed. "Diffusion gradient boosting for networked learning." In 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2017. http://dx.doi.org/10.1109/icassp.2017.7952609.
Full textReports on the topic "Gradient boosting"
Rossi, Jose Luiz, Carlos Piccioni, Marina Rossi, and Daniel Cuajeiro. Brazilian Exchange Rate Forecasting in High Frequency. Inter-American Development Bank, September 2022. http://dx.doi.org/10.18235/0004488.
Forteza, Nicolás, and Sandra García-Uribe. A Score Function to Prioritize Editing in Household Survey Data: A Machine Learning Approach. Madrid: Banco de España, October 2023. http://dx.doi.org/10.53479/34613.
Liu, Hongrui, and Rahul Ramachandra Shetty. Analytical Models for Traffic Congestion and Accident Analysis. Mineta Transportation Institute, November 2021. http://dx.doi.org/10.31979/mti.2021.2102.
Jääskeläinen, Emmihenna. Construction of reliable albedo time series. Finnish Meteorological Institute, September 2023. http://dx.doi.org/10.35614/isbn.9789523361782.