Technology Stocks : Investing in Exponential Growth

From: Paul H. Christiansen, 9/17/2018 2:43:42 PM
 
NVIDIA TAKES ON THE INFERENCE HORDES WITH TURING GPUS

When it comes to machine learning, a lot of the attention in the past six years has focused on the training of neural networks and how the GPU accelerator radically improved the accuracy of networks, thanks to its large memory bandwidth and parallel compute capacity relative to CPUs. Now, the nascent machine learning industry – and it most surely is an industry – is becoming more balanced and has an increasing focus on inference, the act of using those trained models on new data to do something useful with it.

Nvidia is no stranger to inference, of course, even though it has created the dominant platform on which machine learning training is done today. If you train a neural network, you have to run new data against it, so there is always inference going on.
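
The training-versus-inference split the article describes can be illustrated with a minimal sketch (not from the article; the tiny model, random data, and hyperparameters below are arbitrary placeholders): in a framework such as PyTorch, training repeatedly adjusts a model's weights against labeled examples, while inference simply runs the already-trained model on new inputs to get a prediction.

# Minimal illustrative sketch (placeholder model and random data):
# the same network is first trained, then used for inference on new data.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training: adjust the weights against labeled examples.
x_train = torch.randn(64, 4)
y_train = torch.randint(0, 2, (64,))
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    optimizer.step()

# Inference: run the frozen, trained model on new, unlabeled data.
model.eval()
with torch.no_grad():
    x_new = torch.randn(1, 4)
    prediction = model(x_new).argmax(dim=1)

It is this second step, repeated at scale on fresh data, that inference-oriented accelerators are built to serve.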

Read More – The Next Platform