Ensemble Learning Techniques

Simple Ensemble Learning Methods

Voting and Averaging based ensemble methods are the simplest forms of ensemble learning. Voting is used for classification problems, while Averaging is used for regression problems.

  1. Averaging

As the name suggests, in this technique we take the average of all the base model predictions as the final prediction.

For example, suppose we are predicting a house price and three base models predict 450000, 500000, and 550000. With Averaging, the final prediction is the mean of the three: (450000 + 500000 + 550000) / 3 = 500000.

Let’s see this in code:
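The following is a minimal sketch, assuming the California housing dataset and KNN, Lasso, and SVR as base regressors; these are illustrative choices, not necessarily the original setup.

```python
# Averaging-based ensemble sketch. The dataset (California housing) and the
# three base regressors (KNN, Lasso, SVR) are illustrative assumptions.
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import Lasso
from sklearn.svm import SVR

# Load data and hold out a test set
X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Scale features (KNN and SVR are sensitive to feature scale)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Fit three base regressors (default hyperparameters, for illustration)
knn = KNeighborsRegressor().fit(X_train, y_train)
lasso = Lasso().fit(X_train, y_train)
svr = SVR().fit(X_train, y_train)

# Ensemble prediction = simple average of the base model predictions
pred_knn = knn.predict(X_test)
pred_lasso = lasso.predict(X_test)
pred_svr = svr.predict(X_test)
pred_avg = (pred_knn + pred_lasso + pred_svr) / 3
```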

Now, to check whether ensembling (the averaged prediction here) does a better job than the base models, we compare the mean absolute error of each base model with that of the ensemble.
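Continuing the sketch above, this can be done with scikit-learn's mean_absolute_error. The figures shown below are the results reported in the original article; a run of this sketch will give similar but not identical numbers.

```python
# Compare the MAE of the averaged prediction against each base model
from sklearn.metrics import mean_absolute_error

print('Avg MAE:  ', mean_absolute_error(y_test, pred_avg))
print('KNN MAE:  ', mean_absolute_error(y_test, pred_knn))
print('Lasso MAE:', mean_absolute_error(y_test, pred_lasso))
print('SVR MAE:  ', mean_absolute_error(y_test, pred_svr))
```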

Avg MAE: 0.48709255488962744
KNN MAE: 0.5220880505643672
Lasso MAE: 0.7568088178180192
SVR MAE: 0.5015218832952784

The mean absolute error of the ensembled model is lower than that of each individual base model.