Source: Deep Learning on Medium
Accuracy in Machine Learning
In machine learning, performance is one of the main things we want to know about a model: how well is it doing? There are many metrics for measuring a model's performance. Today we will discuss accuracy.
Accuracy is defined as the number of correctly classified points divided by the total number of points in the test set.
Accuracy = (# correctly classified points) / (total # of points in the test set)
Suppose we have 1000 data points, of which 600 are positive and 400 are negative. For the positive points, our model predicted 580 as positive and 20 as negative; for the negative points, it predicted 350 as negative and 50 as positive.
Here, the 580 positive points and the 350 negative points are classified correctly, while 20 positive points and 50 negative points are misclassified.
Total no. of correctly classified points = 930
Total no. of misclassified points = 70
Accuracy = 930/1000 = 93%
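The arithmetic above can be checked with a few lines of Python (the variable names below are just for illustration):

```python
# Counts from the worked example above
total = 1000
correct_positive = 580   # positive points predicted positive
correct_negative = 350   # negative points predicted negative
misclassified = 20 + 50  # 20 positives and 50 negatives misclassified

correct = correct_positive + correct_negative
accuracy = correct / total
print(correct)    # 930
print(accuracy)   # 0.93
```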
Here I use the Iris dataset, fit a Gaussian Naive Bayes classifier, and then measure its accuracy:
from sklearn import datasets
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score
# Loading the Iris dataset
iris = datasets.load_iris()
# Performing Gaussian Naive Bayes
naive_classifier = GaussianNB()
# (For simplicity we predict on the training data itself; in practice,
# predict on a held-out test set.)
pred = naive_classifier.fit(iris.data, iris.target).predict(iris.data)
score = accuracy_score(iris.target, pred)
Points to Remember:
- Accuracy is easy to understand and interpret.
- Always measure accuracy on held-out test data, not the training data.
- Never rely on accuracy alone for an imbalanced dataset, as it can be misleading.
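To see why accuracy misleads on imbalanced data, here is a small sketch with made-up labels: a "model" that always predicts the majority class still scores 95% accuracy while never finding a single positive.

```python
import numpy as np
from sklearn.metrics import accuracy_score

# Hypothetical imbalanced labels: 950 negatives, 50 positives
y_true = np.array([0] * 950 + [1] * 50)

# A useless "model" that always predicts the majority class (0)
y_pred = np.zeros(1000, dtype=int)

print(accuracy_score(y_true, y_pred))  # 0.95, despite missing every positive
```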
Thanks for reading!!!