GPUs are good for parallel computing, but some machine learning libraries don't seem to utilize the GPU unless the workload involves image processing or some other sort of graphics processing. What if I am using machine learning for predictive analytics? Do libraries like TensorFlow utilize the GPU, or do they use only the CPU? Can I choose which processing unit to use? What's the deal here? Note: predictive analytics requires no graphics processing.
The short answer: yes, it will!

The slightly longer answer: the computation that happens on the GPU in any of the machine learning frameworks that support GPUs is not limited to graphics processing. For instance, if your model is a simple logistic regression, a framework such as TensorFlow will run it on the GPU if properly configured. The advantage of GPUs for machine learning is that training big neural networks benefits greatly from the high degree of parallelism that GPUs offer. If you want to know more about this, I'd recommend you start here or here.

Some things to consider:

- How much a model will benefit from running on the GPU depends on how much it benefits from parallel computation in general.
- Deep learning models can be applied to predictive analytics, as can more classical machine learning models. Bear in mind that neural networks are probably the category of models that benefits most inherently from the GPU (see the links above).
- Even though running models on GPUs (or even more specialised hardware) can bring benefits, I would suggest that you don't choose a framework, and especially don't choose an algorithm, based solely on the fact that it will benefit from parallelism; rather, look at how appropriate a given algorithm is for the data you have.
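On the "can I choose which processing unit to use?" part: yes. As a minimal sketch (assuming TensorFlow 2.x with GPU support installed; the device name `/GPU:0` assumes at least one GPU is visible on your machine), you can check which devices TensorFlow sees and pin a computation to one of them:

```python
import tensorflow as tf

# An empty list here means TensorFlow will fall back to the CPU.
print(tf.config.list_physical_devices('GPU'))

# Pin a computation to the first GPU; this raises an error if no GPU is visible.
with tf.device('/GPU:0'):
    a = tf.random.normal((1000, 1000))
    b = tf.random.normal((1000, 1000))
    c = tf.matmul(a, b)  # a plain matrix multiply, no graphics involved

print(c.device)  # shows which device actually executed the op
```

By default, TensorFlow places operations on the GPU automatically whenever one is available, so the explicit `tf.device` context is only needed when you want to override that placement.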