
XGBoost

XGBoost (eXtreme Gradient Boosting) is a gradient boosting algorithm that can be used for supervised learning tasks such as classification and regression. The main idea behind XGBoost is to fit an ensemble of decision trees to the training data.

The ensemble of decision trees is created by iteratively adding new trees to the model, where each tree is trained to correct the mistakes of the previous trees. The final output is a combination of the predictions from all of the trees in the ensemble.
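
As a concrete illustration, here is a minimal sketch that trains a small XGBoost classifier using the xgboost Python package (assuming xgboost and scikit-learn are installed; the dataset and parameter values are illustrative, not prescribed by this article):

import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Toy binary classification data (illustrative only).
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each of the 100 boosting rounds adds one tree that corrects
# the errors of the trees built so far.
model = xgb.XGBClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))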

Each new decision tree in the ensemble is fit using gradient tree boosting, which can be thought of as gradient descent carried out in function space rather than parameter space. The algorithm minimizes the following objective function:

Objective Function = Σ_i L(y_i, y_pred_i) + Ω(f)

where L is the loss function, y_i is the true label and y_pred_i the model's prediction for the i-th training example, f is the ensemble of decision trees, and Ω(f) is a regularization term added to prevent overfitting.
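
In the xgboost library, Ω(f) is exposed through hyperparameters rather than written out explicitly. A minimal sketch with illustrative values: gamma charges a cost for growing additional leaves (it is the minimum loss reduction required to make a split), while reg_lambda and reg_alpha apply L2 and L1 penalties to the leaf weights.

import xgboost as xgb

# Ω(f) in XGBoost penalizes model complexity through these knobs:
model = xgb.XGBRegressor(
    n_estimators=200,
    gamma=1.0,       # minimum loss reduction required to split a node
    reg_lambda=1.0,  # L2 penalty on leaf weights (library default is 1)
    reg_alpha=0.0,   # L1 penalty on leaf weights (library default is 0)
)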

The objective function is minimized additively rather than by re-tuning existing trees: at each iteration, the gradient of the loss with respect to the current predictions is computed (XGBoost also uses the second derivative), and a new tree is fit so that adding its scaled output takes a step in the direction that decreases the objective.
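
To make this step concrete, the sketch below implements plain first-order gradient tree boosting with squared-error loss, using scikit-learn trees; this is a simplification of XGBoost's second-order method, not its exact implementation. For squared error, the negative gradient of the loss with respect to the current predictions is simply the residual y - y_pred, so each new tree is fit to the residuals:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
y_pred = np.full_like(y, y.mean())  # start from a constant prediction
trees = []

for _ in range(100):
    # Negative gradient of the squared-error loss = the residual.
    residuals = y - y_pred
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residuals)
    # Take a small step in the direction that decreases the objective.
    y_pred += learning_rate * tree.predict(X)
    trees.append(tree)

print("training MSE:", np.mean((y - y_pred) ** 2))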

In practice, XGBoost is generally among the fastest and most accurate ensemble methods, which is why it is frequently used in machine learning competitions, research, and industry.

Updated on: 27/01/2023
