Decision Forests
Ensemble methods such as decision forests combine the strengths of multiple machine learning models. Discover how this approach improves accuracy and enables deeper insights in the field of artificial intelligence.
This course introduces decision trees and decision forests.
Decision forests are a family of supervised machine learning models and algorithms. They provide the following benefits:
- They are easier to configure than neural networks. Decision forests have fewer hyperparameters, and those hyperparameters come with good defaults (see the sketch after this list).
- They natively handle numeric, categorical, and missing features. This means you can write far less preprocessing code than when using a neural network, saving time and reducing sources of error.
- They often give good results out of the box, are robust to noisy data, and have interpretable properties.
- They train and run inference much faster than neural networks on small datasets (<1M examples).
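To make the "good defaults, little configuration" point concrete, here is a minimal sketch that trains a random forest classifier with default hyperparameters and no feature scaling. The library choice (scikit-learn) and dataset are illustrative assumptions; the course does not prescribe a specific library.

```python
# Minimal sketch: training a random forest with default hyperparameters.
# Library choice (scikit-learn) and dataset are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Tree-based models need no feature scaling or other numeric preprocessing.
model = RandomForestClassifier(random_state=0)  # defaults often work well
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```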
Decision forests produce great results in machine learning competitions and are heavily used in many industrial tasks. They are practical, efficient, and interpretable. You can use decision forests for many supervised learning tasks (a short regression sketch follows this list), including:
- classification
- regression
- ranking
- uplift modeling
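As one illustration of the regression task, and of native handling of missing values, the sketch below trains gradient boosted trees on data containing NaN entries without any imputation step. Again, the library (scikit-learn's histogram-based gradient boosting) and dataset are assumptions made for the example, not part of the course material.

```python
# Minimal sketch: gradient boosted trees for regression, with missing values
# left in place. Library and dataset are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.05] = np.nan  # simulate missing feature values

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Histogram-based gradient boosted trees handle NaN feature values natively,
# so no imputation is required before training or prediction.
model = HistGradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)
print("R^2 on test set:", model.score(X_test, y_test))
```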
Learning Objectives:
- How decision trees work.
- How decision forests work, including random forests and gradient boosted trees.
- How to use decision forests effectively.
- When decision forests perform well, and what their limitations are.
Prerequisites
Duration: 2.5 hours
Level: Advanced
Certification: Yes
Cost: Free
Language: English
Type: Self-Paced
Please note: these courses are provided by external sources. Links are not actively managed or regularly updated, and content may be moved or become unavailable.