Source: Deep Learning on Medium
What is the best Workstation for Deep learning?
If you don’t already have a GPU that you can use for deep learning (a recent, high-end NVIDIA GPU), then running deep-learning experiments in the cloud is a simple, low-cost way to get started without having to buy any additional hardware. If you’re using Jupyter notebooks, the experience of running in the cloud is no different from running locally.
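A quick way to check whether your machine has a usable NVIDIA GPU is to look for the `nvidia-smi` driver utility. A minimal sketch (the helper name is my own, and a PATH check is only a rough proxy for a fully working CUDA setup):

```python
import shutil

def has_nvidia_gpu() -> bool:
    """Return True if the NVIDIA driver utility `nvidia-smi` is on PATH.

    Finding it suggests an NVIDIA driver is installed; it does not guarantee
    that your deep-learning framework can actually use the card.
    """
    return shutil.which("nvidia-smi") is not None

if has_nvidia_gpu():
    print("NVIDIA driver found - try training locally first.")
else:
    print("No NVIDIA driver found - a cloud instance is the easy way to start.")
```

If you already have a framework installed, its own check (for example, PyTorch's `torch.cuda.is_available()`) is more reliable than this PATH test.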
But if you’re a heavy user of deep learning, this setup isn’t sustainable in the long term, or even for more than a few weeks. Cloud GPU instances are expensive, ranging from roughly $0.50 to $0.90 per hour. Meanwhile, a solid consumer GPU will cost you somewhere between $500 and $1,500.
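A back-of-the-envelope calculation makes the trade-off concrete. A sketch using the illustrative figures above (not quotes from any particular provider):

```python
def break_even_hours(gpu_price_usd: float, cloud_rate_usd_per_hour: float) -> float:
    """Hours of cloud GPU time that cost as much as buying the card outright."""
    return gpu_price_usd / cloud_rate_usd_per_hour

# A $1,000 consumer GPU vs. a $0.90/hour cloud GPU instance:
hours = break_even_hours(1000, 0.90)
print(round(hours))  # about 1111 hours of training time
```

At a few hours of training per day, the consumer card pays for itself within a year, which is why heavy users tend to build their own workstation.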
So, if you are a serious deep-learning user and want to run every piece of code you write, consider building your own workstation. If you don’t have the budget, that’s fine: you can code and run everything on a cloud server. Just remember to stop or shut down the instance when you’re done, because you don’t want an expensive bill for a server that’s sitting idle.