I love how nearly all of this article went over my head.
Fascinating to read nevertheless, and I appreciate companies like Netflix publishing research like this in the open for all of us.
I'd love to work on this stuff, but I'm really not sure where to start. Mathematics has always been a weak spot for me, and whenever I start to read a neural networks book my brain instantly turns to mush as I'm bombarded with equations I can't read.
Does anyone know of any good resources that focus on the code rather than the maths? I'm predominantly a C# programmer, so resources in that language would be a bonus.
I can now admit that I feel the same way, but it doesn't stop me reading this type of article. It's fascinating in a "this is really cool, but am I smart enough to understand it?" kind of way.
The other thing I wondered about: did the performance improvements come from the huge number of cores in GPUs, the increased memory bandwidth, or something else inherent in GPU computing?
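For what it's worth, one common way to reason about this is the roofline model: compare a kernel's arithmetic intensity (FLOPs per byte moved) against the hardware's compute-to-bandwidth ratio. Below that ratio you're bandwidth-bound; above it you're compute-bound. Here's a rough back-of-the-envelope sketch with made-up GPU numbers (nothing here is from the article):

```python
# Roofline-style estimate with hypothetical hardware numbers.
peak_flops = 10e12        # assumed 10 TFLOP/s peak compute
peak_bandwidth = 500e9    # assumed 500 GB/s memory bandwidth

# FLOPs/byte at which the bottleneck flips from bandwidth to compute.
ridge_point = peak_flops / peak_bandwidth
print(f"ridge point: {ridge_point:.1f} FLOPs/byte")

# Example workload: dense matmul C = A @ B with n x n float32 matrices.
n = 4096
flops = 2 * n**3              # one multiply-add per inner-product term
bytes_moved = 3 * n * n * 4   # read A and B, write C (ignoring cache effects)
intensity = flops / bytes_moved
bound = "compute" if intensity > ridge_point else "bandwidth"
print(f"matmul intensity: {intensity:.0f} FLOPs/byte ({bound}-bound)")
```

On numbers like these, a big dense matmul ends up compute-bound (so the core count matters most), while elementwise ops land well below the ridge point and are limited by memory bandwidth. The answer is probably "both, depending on the kernel."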