Original article can be found here (source): Artificial Intelligence on Medium
In the online (server-side) setting → the data has to travel over the network → but on the client, there is no backend → like Node.js.
This runs right inside the browser → so the device should be fairly powerful. (data stays private → nothing is sent over the internet). (and this might be better since → there are fewer roadblocks).
Client-Side NN → is great.
Recommended to use → models that are 30 MB or smaller → MobileNets → are a good choice, but EfficientNet is also good.
For training the model → it is better to use Keras → each framework has its own use case. (we are going to convert the model into TF.js → the Layers API → lets us do very similar things to what is done in Keras).
There is a tool for converting → we need to save the weights as well as the architecture. (to an .h5 file → or we can download pre-trained files).
That is how the conversion is done → very easy → this method works on the spot → saving directly to the TF.js format → super cool!
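The conversion step can be sketched like this, assuming the `tensorflowjs` pip package is installed (the converter CLI and the Python `save_keras_model` helper are the official ones; the file and directory names are placeholders):

```shell
# Install the converter (also provides the Python API)
pip install tensorflowjs

# Option 1: convert a saved Keras .h5 file (weights + architecture)
tensorflowjs_converter --input_format keras model.h5 web_model/

# Option 2: "on the spot" — save directly to the TF.js format from Python
python -c "
import tensorflowjs as tfjs
from tensorflow import keras
model = keras.models.load_model('model.h5')
tfjs.converters.save_keras_model(model, 'web_model/')
"
```

The output directory holds a `model.json` (architecture) plus binary weight shard files that `tf.loadLayersModel` can fetch from the browser.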
Hosting the web application → we need some kind of web server → and obviously they are going to use Node.js → it is the right choice.
But we should go a step further → to optimize the page → using Next.js and exporting to static files → which preloads everything. (terminal running → install everything via npm).
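A minimal sketch of that setup, assuming an older Next.js version where `next export` is a separate command (the app name is illustrative; newer versions replace the command with `output: 'export'` in `next.config.js`):

```shell
# install everything via npm
npx create-next-app my-tfjs-app
cd my-tfjs-app
npm install @tensorflow/tfjs

# build an optimized bundle and export it as static files
npx next build
npx next export   # writes static HTML/JS to ./out
```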
And we are hosting the web app → in static form → very interesting → in this case we can even use Nginx. (listening on port 81)
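A minimal Nginx config fragment for serving the exported static files on port 81 (the root path is a placeholder):

```nginx
server {
    listen 81;                      # listening on port 81
    root /var/www/my-tfjs-app/out;  # the statically exported site
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;  # fall back for client-side routing
    }
}
```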
Now we are able to run the web application → all of the ML will be running in TF.js → hence we cannot just reuse a Flask implementation.
Converted everything to TF.js model weight files. (they are going to use Bootstrap → these educational videos are actually really great → they convert a lot of different things).
And they also use Chrome Developer Tools.
Get the image data into a base64 string → pass it to the model → and it will make the prediction → the data format is very important.
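The data-format point: in the browser an image usually arrives as a base64 data URL, which has to be stripped down to raw bytes before it can become a tensor. A Node sketch of just the decoding step (the tensor step itself would use something like `tf.browser.fromPixels` and is omitted; the data URL here is a toy value):

```javascript
// A data URL as produced by FileReader.readAsDataURL or canvas.toDataURL
const dataUrl = "data:image/png;base64,aGVsbG8=";

// Strip the "data:image/...;base64," prefix to get the raw base64 payload
const base64 = dataUrl.split(",")[1];

// Decode into raw bytes — this is what the preprocessing pipeline consumes
const bytes = Buffer.from(base64, "base64");
console.log(bytes.length); // 5 bytes ("hello")
```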
Might it be a better idea to have the model always loaded?
We need to build the preprocessing functions ourselves → this is not a good thing lol.
It takes 5 seconds to make a prediction → this is very bad → we need to preprocess the image.
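What "building the preprocessing ourselves" typically amounts to: scaling the raw 0–255 pixel bytes into the float range the network expects. A plain-JavaScript sketch of that step (in TF.js this would be a tensor op like dividing by 255, but the logic is the same; the sample pixel values are made up):

```javascript
// Raw RGBA pixel data, e.g. from canvas getImageData().data (values 0–255)
const pixels = Uint8ClampedArray.from([0, 128, 255, 255]);

// Normalize each channel into [0, 1] as Float32 — the range most models expect
function normalize(data) {
  const out = new Float32Array(data.length);
  for (let i = 0; i < data.length; i++) {
    out[i] = data[i] / 255;
  }
  return out;
}

const input = normalize(pixels);
console.log(input); // Float32Array [0, ~0.502, 1, 1]
```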
Damn → quite a lot of preprocessing is done → hopefully all of these functions are implemented as a library or an add-on. (we are going to look into debuggers → super cool! → proper use of breakpoints to stop the code → and inspect variables → the debug console is also there to help the developer out.)
This seems to be an effective as well as efficient way of programming and developing. (getting the shapes right for TF.js seems to be very hard to do) → most of the debugging is making sure the dimensions are correct for certain tensors.
Wow, broadcasting → we are able to reduce the lines of code → since tensor operations can be performed → across compatible dimensions.
How operations are done → broadcasting → very impressed that this is covered in TF.js. (the tensor operations are very powerful).
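The broadcasting idea, sketched in plain JavaScript to show what happens under the hood when dimensions are compatible (in TF.js itself this is just `a.add(b)` on tensors of shapes [2,3] and [3]; the helper name here is made up):

```javascript
// Add a length-3 vector to every row of a 2x3 matrix —
// the vector is "broadcast" along the first dimension.
function broadcastAdd(matrix, vector) {
  if (matrix[0].length !== vector.length) {
    throw new Error("incompatible dimensions");
  }
  return matrix.map(row => row.map((v, j) => v + vector[j]));
}

const m = [[1, 2, 3], [4, 5, 6]];
const v = [10, 20, 30];
console.log(broadcastAdd(m, v)); // [[11, 22, 33], [14, 25, 36]]
```

This is why the line count drops: one vectorized operation replaces an explicit loop over rows.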
Now the user is able to select which model to use → very cool!