Originally posted by coder
While it is true that GPUs are used for training, inference can also run on a CPU, which is much cheaper than renting cloud machines with GPUs.
Originally posted by coder
It would be even better if we could have both.
Originally posted by coder
But not everybody has the financial resources for that; specialized accelerators are still quite expensive, so I believe there are still many cases where inference is run on the CPU.
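To make the point concrete: at inference time a network is just a forward pass, i.e. matrix multiplies and element-wise activations, and any CPU can execute that. Here is a minimal sketch using NumPy; the tiny two-layer network and its random weights are hypothetical, purely for illustration, not taken from any post in this thread.

```python
import numpy as np

# Hypothetical two-layer network: 4 inputs -> 8 hidden -> 3 outputs.
# Random weights stand in for a trained model.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))   # hidden-layer weights
W2 = rng.standard_normal((8, 3))   # output-layer weights

def relu(x):
    return np.maximum(x, 0.0)

def forward(x):
    """One inference pass, executed entirely on the CPU."""
    return relu(x @ W1) @ W2

x = rng.standard_normal((1, 4))    # one input sample
logits = forward(x)
print(logits.shape)                # (1, 3)
```

Training the weights is where GPUs earn their keep; once the weights are fixed, a forward pass like this is plain arithmetic that a commodity CPU handles without any accelerator.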