
Improve Your Classification with CART and Gradient Boosting


ON-DEMAND: Fill out the form to the right.

Presentation Date: Wednesday, April 5, 2017

Duration: 55 minutes

Speaker: Charles Harrison, Marketing Statistician, Salford Systems

Cost: Free 


Abstract: In this webinar we'll introduce you to two tree-based machine learning algorithms: CART decision trees and Gradient Boosting. Both methods can be used for either regression or classification (e.g., predicting whether Y = “Application Denied” or “Application Accepted”), and we will focus on classification in this presentation. Gradient boosting often outperforms linear regression, Random Forests, and CART, and boosted trees automatically handle variable selection, variable interactions, nonlinear relationships, outliers, and missing values.
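To make the CART side of the abstract concrete: a CART classification tree grows by repeatedly choosing the split that most reduces class impurity. Here is a minimal pure-Python sketch (an illustration, not TreeNet's implementation) that finds the best single split on one feature by minimizing weighted Gini impurity, using a hypothetical loan-application toy dataset:

```python
# Minimal sketch of a CART-style split search (illustrative only, not TreeNet).
# Labels are 0/1, e.g. 1 = "Application Denied", 0 = "Application Accepted".

def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(xs, ys):
    """Return (threshold, weighted_impurity) of the best binary split x <= t."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical toy data: lower incomes tend to be denied (label 1).
income = [20, 25, 30, 45, 50, 60, 70, 80]
denied = [1, 1, 1, 0, 0, 0, 0, 0]
threshold, impurity = best_split(income, denied)
print(threshold, impurity)  # -> 30 0.0 (a perfectly pure split at income <= 30)
```

A full CART tree simply applies this split search recursively to each resulting partition until a stopping rule is met.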


We'll see that CART decision trees are the building blocks of gradient boosting and discuss some of the advantages of boosting over a Random Forest. We will walk through the gradient boosting algorithm and discuss the most important modeling parameters, such as the learning rate, the number of terminal nodes, the number of trees, and the choice of loss function. We will then demonstrate fitting the model with an implementation of gradient boosting (TreeNet® Software) and compare its performance to a linear regression model, a CART tree, and a Random Forest.
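The parameters mentioned above can be seen working together in a toy version of the algorithm. The sketch below (an assumed illustration, not TreeNet's implementation) boosts one-split regression trees "stumps" with a logistic loss: each round fits a stump to the negative gradient (y minus the predicted probability) and adds it to the model, scaled by the learning rate. The dataset and variable names are hypothetical.

```python
# Gradient boosting sketch for binary classification (illustrative only).
# Shows the roles of: number of trees, learning rate, terminal nodes
# (a stump has two), and the loss function (logistic loss here).
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_stump(xs, residuals):
    """Regression stump: best threshold + leaf means, by squared error."""
    best = None
    for t in sorted(set(xs))[:-1]:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, n_trees=50, learning_rate=0.1):
    """Gradient boosting with logistic loss; returns a probability predictor."""
    scores = [0.0] * len(xs)  # F_0 = 0 for every observation
    stumps = []
    for _ in range(n_trees):
        # Negative gradient of the logistic loss: y - p
        residuals = [y - sigmoid(f) for y, f in zip(ys, scores)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        scores = [f + learning_rate * stump(x) for f, x in zip(scores, xs)]
    return lambda x: sigmoid(sum(learning_rate * s(x) for s in stumps))

# Hypothetical toy data: lower incomes tend to be denied (label 1).
income = [20, 25, 30, 45, 50, 60, 70, 80]
denied = [1, 1, 1, 0, 0, 0, 0, 0]
predict = boost(income, denied)
# predict(22) is high (likely denied); predict(75) is low (likely accepted)
```

A smaller learning rate with more trees generally gives smoother, better-regularized fits, which is one of the trade-offs discussed in the webinar.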


Tags: CART, Gradient Boosting, TreeNet, Random Forest, logistic regression, linear regression, machine learning, variable selection, variable interactions, nonlinear relationships, outliers, missing values, loss functions, learning rate, classification, decision trees, targeted marketing, segmentation

Please register and we will send you a recording.