Multi-Class Classifier in the browser using TensorFlow.js

Original article was published on Deep Learning on Medium


Most of us are familiar with running machine learning code in Jupyter notebooks or Google Colab, but to make a model presentable we would normally need to deploy an entire ecosystem, which requires knowledge of a Python web server framework and the overall structure of interactive Python data applications. To avoid all that hardship, we are going to use TensorFlow.js, which lets us develop machine learning models in JavaScript and use machine learning directly in the browser or in Node.js.

In this blog, we’ll step through an example of how to read data from a CSV file and use it to train a classifier. We will be working with breast cancer data adapted from the Wisconsin dataset.

Things are pretty easy compared to TensorFlow in Python, but you will still need the following installed before you get started:

  1. Google Chrome
  2. VSCode or Brackets
  3. Web Server for Chrome

Step 1: Create a Simple Web Page

Inside a folder, create a file named ML_Classifier.html with the following code.

<html>
<head>
</head>
<script lang="js"></script>
<body>
<h1>ML_Classifier</h1>
</body>
</html>

Step 2: Downloading the data

Download the dataset from this link.

The data has already been split for you into a training and a validation set. If you look at the CSV files, you will notice that the first column corresponds to the ‘diagnosis’. The values in this column correspond to the diagnosis where a value of 1 indicates malignant cancer, and a 0 indicates a benign one.

Step 3: Importing TensorFlow.js

Add a script tag below the head and above the body to load the TensorFlow.js file.

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@latest"></script>

Step 4: Reading the Dataset

const trainingUrl = 'wdbc-train.csv';
const trainingData = tf.data.csv(trainingUrl, {
    columnConfigs: {
        diagnosis: {
            isLabel: true
        }
    }
});

We read the data from the wdbc-train.csv file and mark the label column of the dataset by setting isLabel to true for ‘diagnosis’.

Repeat the same process for testing data present in ‘wdbc-test.csv’:

const testingUrl = 'wdbc-test.csv';
const testingData = tf.data.csv(testingUrl, {
    columnConfigs: {
        diagnosis: {
            isLabel: true
        }
    }
});

Step 5: Data Transformation

const convertedTrainingData = trainingData.map(({xs, ys}) => {
    return {xs: Object.values(xs), ys: Object.values(ys)};
}).batch(10);

We will convert the data into arrays. In this case, the labels are integers, not strings. Therefore, there is no need to convert string labels into a one-hot encoded array of label values.

Repeat the same process for testing data:

const convertedTestingData = testingData.map(({xs, ys}) => {
    return {xs: Object.values(xs), ys: Object.values(ys)};
}).batch(10);

Step 6: Building the model

First, we need to specify the number of input features. You can get it from the number of columns in the training data, minus one for the label column.

const numOfFeatures = (await trainingData.columnNames()).length - 1;

We define the model as a tf.sequential and then build a neural network that predicts 1 if the diagnosis is malignant and 0 if the diagnosis is benign. Our neural net has a first hidden layer of 5 units with a sigmoid activation, a second hidden layer of 10 units with a ReLU activation, and an output layer with a single unit and a sigmoid activation.

const model = tf.sequential();
model.add(tf.layers.dense({inputShape: [numOfFeatures], activation: "sigmoid", units: 5}));
model.add(tf.layers.dense({activation: "relu", units: 10}));
model.add(tf.layers.dense({activation: "sigmoid", units: 1}));

Compile the model using the binaryCrossentropy loss, the rmsprop optimizer, and accuracy as the metric.

model.compile({loss: 'binaryCrossentropy', optimizer: tf.train.rmsprop(0.1), metrics: ['accuracy']});

Step 7: Training and Testing the model

To do the training, we use model.fitDataset. Pass the training data as the first parameter, then a JSON-style object of options: the number of epochs (100 here), the validationData (our converted testing data will act as the validation set), and the callbacks. In the callbacks we specify an onEpochEnd handler, where we simply log the epoch number, the current loss, and the current accuracy.

await model.fitDataset(convertedTrainingData, {
    epochs: 100,
    validationData: convertedTestingData,
    callbacks: {
        onEpochEnd: async (epoch, logs) => {
            console.log("Epoch: " + epoch + " Loss: " + logs.loss + " Accuracy: " + logs.acc);
        }
    }
});

Step 8: Putting everything together

Training should live inside an asynchronous function because it takes an indeterminate time to complete, and we don’t want to block the browser while it runs. So it’s better to wrap everything in an asynchronous function and await the training call; once the function completes, the model is trained.

Here’s the full code with run() as our asynchronous function:

async function run() {

    const trainingUrl = 'wdbc-train.csv';
    const trainingData = tf.data.csv(trainingUrl, {
        columnConfigs: {
            diagnosis: {
                isLabel: true
            }
        }
    });

    const convertedTrainingData = trainingData.map(({xs, ys}) => {
        return {xs: Object.values(xs), ys: Object.values(ys)};
    }).batch(10);

    const testingUrl = 'wdbc-test.csv';
    const testingData = tf.data.csv(testingUrl, {
        columnConfigs: {
            diagnosis: {
                isLabel: true
            }
        }
    });

    const convertedTestingData = testingData.map(({xs, ys}) => {
        return {xs: Object.values(xs), ys: Object.values(ys)};
    }).batch(10);

    const numOfFeatures = (await trainingData.columnNames()).length - 1;

    const model = tf.sequential();
    model.add(tf.layers.dense({inputShape: [numOfFeatures], activation: "sigmoid", units: 5}));
    model.add(tf.layers.dense({activation: "relu", units: 10}));
    model.add(tf.layers.dense({activation: "sigmoid", units: 1}));

    model.compile({loss: 'binaryCrossentropy', optimizer: tf.train.rmsprop(0.1), metrics: ['accuracy']});

    await model.fitDataset(convertedTrainingData, {
        epochs: 100,
        validationData: convertedTestingData,
        callbacks: {
            onEpochEnd: async (epoch, logs) => {
                console.log("Epoch: " + epoch + " Loss: " + logs.loss + " Accuracy: " + logs.acc);
            }
        }
    });
}
run();

Step 9: Running the code

  1. Open the Chrome browser, go to this link, and click on launch app.

  2. Click on “CHOOSE FOLDER” and select the folder that contains the examples or exercises you want to run. For this example, we are going to run the ML_Classifier.html file.

  3. Once you have chosen the correct folder, you can click on the Web Server URL (http://127.0.0.1:8887).

  4. Clicking the Web Server URL will open a new tab in your Chrome browser. You can now click on the HTML file you want to run. In this case, we are going to run the ML_Classifier.html file.

  5. A new blank page will appear. Press Ctrl + Shift + I (Developer Tools) and select the Console tab to see your output.

  6. If you see an accuracy close to 90%, congratulations, you have made your first neural network using JavaScript.

If you want the full code, you can check this link to my GitHub account.

Hope you all liked it. Do leave your feedback in the comments.

Thank you for reading!!!