Gradient Boosting in Automatic Machine Learning: Feature Selection and Hyperparameter Optimization

Abstract

This thesis focuses on automating model selection within automatic machine learning (AutoML), specifically through gradient boosting techniques such as gradient tree boosting and component-wise gradient boosting. It addresses challenges in hyperparameter optimization using Bayesian methods, introduces a new feature-selection technique, and proposes an AutoML approach that simplifies the process while maintaining predictive accuracy. Four R packages were developed: mlrMBO for Bayesian optimization, autoxgboost for AutoML, compboost for component-wise boosting, and gamboostLSS for generalized additive models for location, scale and shape. (Shortened.)

Dissertation

LMU München. Apr. 2019

Authors

J. Thomas

Links

DOI

Research Area

A1 | Statistical Foundations & Explainability

BibTeX Key: Tho19
