RANDOM FOREST IN THE PARLIAMENT!


This is it.

This is all you need to know to build a strong high-level intuition of Random Forest. This is why I suggest imagining real-life instances whenever you jump into a problem. Through this simple example, we covered all of these concepts in one sitting (a short code sketch after the list ties them together):

  • Classification problem intuition (Pass or Terminate a Bill)
  • Identifying features (Employment rate, literacy rate, etc.)
  • Bagging (Each member works from their own random sample of the available data)
  • if-else decision making (Condition-based decisions from every member)
  • Ensemble of classification models (Bill passed or terminated based on majority voting)
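
To tie the list together, here is a minimal sketch of the same parliament in code. None of it comes from the original article: the "employment rate" and "literacy rate" numbers, the 25 members, and the synthetic pass/terminate rule are all fabricated for illustration. The members are plain scikit-learn decision trees, each trained on its own bootstrap sample and combined by majority vote.

```python
# Hypothetical parliament-as-forest sketch: bagging + per-member
# decision trees + majority voting. All data is made up.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)

# Toy history: each row is a past bill described by
# [employment_rate, literacy_rate]; label 1 = passed, 0 = terminated.
X = rng.uniform(low=[0.4, 0.5], high=[0.9, 1.0], size=(200, 2))
y = ((0.6 * X[:, 0] + 0.4 * X[:, 1]) > 0.7).astype(int)  # synthetic rule

n_members = 25  # "members of parliament" = trees in the forest
members = []
for _ in range(n_members):
    # Bagging: each member studies their own bootstrap sample of the data
    idx = rng.integers(0, len(X), size=len(X))
    tree = DecisionTreeClassifier(max_features=1)  # each split considers one random feature
    tree.fit(X[idx], y[idx])
    members.append(tree)

# A new bill arrives: the house decides by majority voting
new_bill = np.array([[0.75, 0.8]])
votes = np.array([member.predict(new_bill)[0] for member in members])
print("Votes to pass:", votes.sum(), "of", n_members)
print("Decision:", "Pass" if votes.sum() > n_members / 2 else "Terminate")
```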

But the most important concept, one we learned without mentioning it even once, was BIAS resulting from unbalanced data.
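
Here is a quick, hedged illustration of that hidden lesson, again with fabricated data rather than anything from the article: when roughly 95% of past bills passed, the headline accuracy of a forest trained on that history can look flattering while recall on the rare "terminated" class lags far behind.

```python
# Illustration of bias from unbalanced data, using synthetic data only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score

# Fabricated history: ~95% of past bills passed (class 1),
# only ~5% were terminated (class 0), with heavily overlapping groups.
X, y = make_classification(n_samples=2000, n_features=5, weights=[0.05, 0.95],
                           class_sep=0.5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_tr, y_tr)
pred = forest.predict(X_te)

# Overall accuracy typically looks strong here, while recall on the
# rare "terminated" class tells a much poorer story.
print("Accuracy:", round(accuracy_score(y_te, pred), 3))
print("Recall on terminated bills:", round(recall_score(y_te, pred, pos_label=0), 3))
```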

Similarly, Random Forest can be represented through plenty of other real-life examples, something that is rarely possible with such ease for most other classification algorithms. But you can count on me; I will still try.

In spite of a series of innovations across the data science community, Random Forest remains one of the most intuitive techniques, which calls for celebrating simplicity at its best!

Keep an eye on this space for a series of machine learning concepts, delivered to you as if you already knew them!