Hacker News

> All of the theory and such is mostly worthless

No. You will not get beyond copy-paste level without being comfortable with ML foundations. That doesn't mean you need to be able to prove variational inference bounds in your sleep, but you'll want to know why we need things like lower bounds for approximate inference.
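To make that concrete: the lower bound in question is the ELBO, and you can check it numerically in a few lines. A toy sketch on a two-component Gaussian mixture (the numbers are illustrative, not from any real model) — any choice of q gives a lower bound on log p(x), and the true posterior makes it tight:

```python
import math

def log_gauss(x, mu, sigma):
    # log density of N(mu, sigma^2) at x
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

x = 0.3
priors = [0.5, 0.5]                 # p(z) for the two mixture components
mus, sigmas = [-1.0, 1.0], [1.0, 1.0]

# exact evidence: log p(x) = log sum_z p(z) p(x|z)
log_px = math.log(sum(pz * math.exp(log_gauss(x, m, s))
                      for pz, m, s in zip(priors, mus, sigmas)))

def elbo(q):
    # E_q[log p(x, z) - log q(z)]; Jensen's inequality says this <= log p(x)
    return sum(qz * (math.log(pz) + log_gauss(x, m, s) - math.log(qz))
               for qz, pz, m, s in zip(q, priors, mus, sigmas) if qz > 0)

for q in ([0.5, 0.5], [0.9, 0.1], [0.2, 0.8]):
    assert elbo(q) <= log_px + 1e-9   # always a lower bound
```

The gap between elbo(q) and log_px is exactly KL(q || posterior), which is why maximizing the bound over q doubles as approximate posterior inference.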



>No. You will not get beyond copy-paste level without being comfortable with ML foundations.

But everyone else in here is hyping fastai, which is not just copy-paste but wrapped copy-paste at that (so you're not even learning PyTorch).


Sure, go through the fastai material and maybe write a blog post about how you learned ML (read: DL) in a few months. What you really learned is copy-pasting code (as you mentioned) and some neural-net tricks (like picking a good learning rate to start SGD with).
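For what it's worth, that learning-rate trick is essentially Leslie Smith's LR range test (what fastai automates as lr_find): sweep the learning rate geometrically, watch the loss, and pick something below where it blows up. A minimal sketch of the idea on a toy 1-D quadratic — not fastai's API, just the underlying logic:

```python
# Toy LR range test: for each candidate lr, run a few SGD steps on
# loss(w) = (w - 3)^2 and record the final loss. Small lr barely moves;
# too-large lr diverges; the sweet spot is in between.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

def lr_range_test(lrs, steps=10):
    results = []
    for lr in lrs:
        w = 0.0
        for _ in range(steps):       # a few SGD steps at this fixed lr
            w -= lr * grad(w)
        results.append((lr, loss(w)))
    return results

lrs = [1e-3 * (2 ** i) for i in range(12)]    # 0.001 up to ~2.0
results = lr_range_test(lrs)
best_lr, best_loss = min(results, key=lambda t: t[1])
```

On this quadratic the update multiplies the error by (1 - 2*lr), so anything past lr = 1.0 visibly diverges, which is the "cliff" the range test looks for in real training curves.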

How to learn ML? Do fastai + reading Daphne Koller's and Chris Bishop's books on PGMs + re-implementing a paper on Gaussian process classification + another paper on GNNs + ....
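Re-implementing a GP paper is a good filter because the regression case fits in a screenful of code, and classification is the same machinery plus a non-Gaussian likelihood — which is exactly where the approximate-inference bounds mentioned upthread come in. A hedged toy sketch of GP regression with two training points and an RBF kernel, in pure Python (all numbers illustrative):

```python
import math

def rbf(a, b, ls=1.0):
    # squared-exponential kernel with lengthscale ls
    return math.exp(-0.5 * (a - b) ** 2 / ls**2)

X = [0.0, 1.0]          # training inputs
y = [0.0, 1.0]          # training targets
noise = 1e-2

# (K + noise*I) for two points, inverted by hand as a 2x2 matrix
k11 = rbf(X[0], X[0]) + noise
k22 = rbf(X[1], X[1]) + noise
k12 = rbf(X[0], X[1])
det = k11 * k22 - k12 * k12
Kinv = [[k22 / det, -k12 / det], [-k12 / det, k11 / det]]

def predict(x_star):
    # posterior mean: k_*^T (K + noise I)^{-1} y
    ks = [rbf(x_star, X[0]), rbf(x_star, X[1])]
    alpha = [sum(Kinv[i][j] * y[j] for j in range(2)) for i in range(2)]
    return sum(ks[i] * alpha[i] for i in range(2))
```

With a Gaussian likelihood this posterior is exact; swap in a probit or logistic likelihood (classification) and you need Laplace, EP, or variational approximations — which is the point of doing the exercise.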


Bishop's book is a good suggestion for ML (I prefer Hastie), but you have to admit that

1. fastai is neural nets only;
2. Bishop's book (and the others mentioned) are grad-level texts that require considerable mathematical training to really profit from;
3. those books don't teach anything practical!

So ultimately I completely agree with the OP of this thread: just jump in and read around when things don't work the way you expect.



