HackerNews Readings
40,000 HackerNews book recommendations identified using NLP and deep learning

laichzeit0 on Jan 7, 2020

There’s already a term you can use. “Statistical learning”. There’s even a well known important book with that title: Elements of Statistical Learning.

beccaaf1229 on Sep 2, 2020

This book is awesome! How does this compare to Introduction to Statistical Learning or Elements of Statistical Learning? Other than the addition of code?

markovbling on Dec 26, 2016

You should start with "Introduction to Statistical Learning", which is the baby brother of "Elements of Statistical Learning" (arguably THE reference book) - it's easy to follow and has examples in R, a functional language.

Plus, it's free!

http://www-bcf.usc.edu/~gareth/ISL/

larrydag on Aug 12, 2017

Two good ebooks. Go well with R.

Introduction to Statistical Learning http://www-bcf.usc.edu/~gareth/ISL/

Elements of Statistical Learning https://web.stanford.edu/~hastie/ElemStatLearn/

snotrockets on Jan 25, 2014

"The Elements of Statistical Learning" is great. It assumes an astute reader, but if you've made it through some post-graduate level work, you'd be fine.

exg on Aug 22, 2012

The Elements of Statistical Learning, by T. Hastie, R. Tibshirani and J. Friedman [1] is also a very good one. Plus, the book is freely available on the authors' website.

[1] http://www-stat.stanford.edu/~tibs/ElemStatLearn/

glimcat on Aug 27, 2011

The Elements of Statistical Learning is pretty good. It's also available for free.

http://www-stat.stanford.edu/~tibs/ElemStatLearn/

teruakohatu on May 5, 2020

The Elements of Statistical Learning and Introduction to Statistical Learning are THE textbooks for an introduction to statistical methods of data science. They are free and very high quality. Most of my class didn't buy them, but many, including myself, did.

jaf656s on Nov 21, 2009

FYI, The Elements of Statistical Learning can be found here as a free PDF:

http://www-stat.stanford.edu/~tibs/ElemStatLearn/

pmbouman on Feb 12, 2009

Try "The Elements of Statistical Learning" by Hastie, Tibshirani and Friedman. Lots of math but an outstanding introduction.

carbocation on Dec 7, 2014

Tibshirani and Hastie are two of my favorite authors covering machine learning. Hastie even makes Elements of Statistical Learning available for free as a PDF on his website [1].

1 = (10+ MB PDF warning) https://web.stanford.edu/~hastie/local.ftp/Springer/ESLII_pr...

caffeine on June 19, 2009

Don't forget Hastie & Tibshirani's Elements of Statistical Learning! :)

Eugeleo on June 15, 2020

Any recommendations (in whatever area), apart from Elements of Statistical Learning and Linear Algebra Done Right? And are their Probability and All of Statistics books any good?

deltux on Apr 18, 2017

The Elements of Statistical Learning, by Hastie, Tibshirani and Friedman, for everything on Machine Learning and Statistics. Available for free online:
https://statweb.stanford.edu/~tibs/ElemStatLearn/

krapht on Oct 15, 2016

My favorite textbook: Elements of Statistical Learning by Hastie. It's free, too!

If you don't understand something in the book, back up and learn the pre-reqs as needed.

http://web.stanford.edu/~hastie/ElemStatLearn/printings/ESLI...

ktharavaad on Feb 2, 2009

It's a good read, but it's kind of heavy on theorems and math, with little in the way of applications, algorithms, or background, which makes it dry to read.

To do the field of statistical learning justice, "The Elements of Statistical Learning" is an excellent book on the subject.

mendeza on Aug 28, 2017

What about the Pattern Recognition book by Bishop? I am reading it now and it's more approachable than the Elements of Statistical Learning book.

bhickey on July 3, 2011

For applications to machine learning, I strongly recommend Elements of Statistical Learning. The text is eye-opening. A PDF is freely available on Rob Tibshirani's website: http://www-stat.stanford.edu/~tibs/ElemStatLearn/

sn9 on June 10, 2017

Introduction to Statistical Learning is free and quite good: http://www-bcf.usc.edu/~gareth/ISL/

Follow it up with Elements of Statistical Learning by three of the same authors for more advanced stuff.

deng on Apr 28, 2019

Since the site mentions "An Introduction to Statistical Learning":

The first book on statistical learning by Hastie, Tibshirani and Friedman, which is absolutely terrific, is freely available for download:

The Elements of Statistical Learning

http://web.stanford.edu/~hastie/ElemStatLearn/

ivanech on Jan 10, 2020

An interesting note: Trevor Hastie is an author on this paper. The crowd around here probably knows him best for books he co-wrote: The Elements of Statistical Learning (2001) and An Introduction to Statistical Learning (2013).

curiousgal on July 27, 2018

The Elements of Statistical Learning

or the more beginner friendly

An Introduction to Statistical Learning: With Applications in R

vector_spaces on June 15, 2020

If I'm not mistaken, Elements of Statistical Learning and all the other books by these authors are already available for free on their website:

https://web.stanford.edu/~hastie/pub.htm

jimbokun on Dec 16, 2009

Skimming the preface, this book appears to be something like the approach btilly describes. Thanks for the link. I intend to download it to my iPod Touch, where it will compete for my rare moments of free time with Elements of Statistical Learning and the other free math and CS books I'm sure will be on there soon.

Bootvis on Mar 6, 2017

I haven't taken a lot of data science classes but I'm not sure that's true. If you start with linear regression the mean squared error would make more sense. I actually searched through "The Elements of Statistical Learning" and the word 'recall' is not used in this sense at all.
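(To illustrate the distinction the comment draws, here is a quick sketch of my own, not from the book: mean squared error is the natural way to score a regression fit, whereas recall only makes sense for classification.)

```python
# Mean squared error for a regression fit: average of squared residuals.
# Recall, by contrast, is defined over class labels (true positives /
# actual positives) and has no analogue for continuous predictions.
def mean_squared_error(y_true, y_pred):
    assert len(y_true) == len(y_pred)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [3.0, 5.0, 7.0]   # observed values (made-up data)
y_pred = [2.5, 5.0, 8.0]   # model predictions
mse = mean_squared_error(y_true, y_pred)  # (0.25 + 0.0 + 1.0) / 3
```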

disgruntledphd2 on Nov 24, 2020

So I guess the authors of the Elements of Statistical Learning aren't "real" researchers then?

For reference, the authors of that book (the best book about ML in general) were all involved in the development of S and R.

knn on Mar 16, 2016

AI by Russell and Norvig. Machine learning by Murphy, Elements of Statistical Learning by Hastie et al. Just a few good ones out of many!

simonflynn on Dec 18, 2014

If anyone is interested in getting into algo trading, other books not mentioned but are highly regarded are:

- "The Evaluation and Optimization of Trading Strategies" by Pardo

- "The Elements of Statistical Learning" by Hastie et al

snikolov on May 13, 2010

Check out
http://www.autonlab.org/tutorials/

and also a free ebook called Elements of Statistical Learning
http://www-stat.stanford.edu/~tibs/ElemStatLearn/

I've also found a number of course note sets helpful, for example, MIT's machine learning course

http://ocw.mit.edu/OcwWeb/Electrical-Engineering-and-Compute...

dcl on June 10, 2017

I can thoroughly recommend Elements of Statistical Learning.

It won't teach you much about theoretical statistics, or even things like experiment design, but you will learn a LOT about regression, classification and model fitting which is what everyone seems to want to be able to do these days.

madenine on Nov 22, 2019

Goodfellow’s book on deep learning[0] is a good starter - the first chapters give a solid overview of ML theory as well. Elements of Statistical Learning is another.

[0]http://www.deeplearningbook.org/

cs702 on June 6, 2015

These slide tutorials are excellent: engaging and friendly but still rigorous enough that they can be used as reference materials. They're a great companion to "Introduction to Statistical Learning" and "The Elements of Statistical Learning" by Hastie, Tibshirani, et al. The author of these tutorials is Andrew Moore, Dean of the School of Computer Science at Carnegie Mellon.

aheilbut on Nov 19, 2011

Most of the textbooks on machine learning present things in that way (it's really the only way). For example, check out Elements of Statistical Learning (http://www-stat.stanford.edu/~tibs/ElemStatLearn/).

warrenmar on Mar 26, 2013

I would also recommend the Coursera course on Machine Learning by Andrew Ng and Probabilistic Graphical Models by Daphne Koller.
I would also go over some basics probability and statistics review. Maybe some linear algebra too.
Python is a great language to do data analysis in. I recommend the scikit-learn and pandas packages and using ipython notebooks.
Another book is the Elements of Statistical Learning (http://www-stat.stanford.edu/~tibs/ElemStatLearn/).
There are also Kaggle contests for testing your chops.
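(As a minimal illustration of the scikit-learn/pandas workflow recommended above; the dataset here is made up for the example.)

```python
# Fit a linear regression with pandas + scikit-learn, then predict.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Toy, perfectly linear data: score = 49 + 3 * hours (hypothetical).
df = pd.DataFrame({
    "hours": [1.0, 2.0, 3.0, 4.0],
    "score": [52.0, 55.0, 58.0, 61.0],
})

X = df[["hours"]]   # features must be 2-D (a DataFrame, not a Series)
y = df["score"]

model = LinearRegression().fit(X, y)
pred = model.predict(pd.DataFrame({"hours": [5.0]}))[0]  # extrapolate
```

The same `fit`/`predict` pattern carries over to nearly every estimator in scikit-learn, which is much of why the library is recommended so often.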

maurits on May 22, 2014

The most popular choices seem to be:

Machine Learning: a Probabilistic Perspective, by Murphy

http://www.cs.ubc.ca/~murphyk/MLbook/

Pattern Classification, by Duda et al.

http://www.amazon.com/Pattern-Classification-Pt-1-Richard-Du...

The Elements of Statistical Learning, by Hastie et al. It is free from Stanford.

http://www-stat.stanford.edu/~tibs/ElemStatLearn

Mining of Massive Datasets, free from Stanford.

http://infolab.stanford.edu/~ullman/mmds.html

Bayesian Reasoning and Machine Learning, by Barber, freely available online.

http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=...

Learning from data, by Abu-Mostafa.

It comes with Caltech video lectures: http://work.caltech.edu/telecourse.html

Pattern Recognition and Machine Learning, by Bishop

http://research.microsoft.com/en-us/um/people/cmbishop/prml/

Also noteworthy

Information Theory, Inference, and Learning Algorithms, by Mackay, free.

http://www.inference.phy.cam.ac.uk/itprnn/book.html

Classification, Parameter Estimation and State Estimation, by van der Heijden.

http://prtools.org

Computer Vision: Models, Learning, and Inference, by Prince, available for free

http://www.computervisionmodels.com/

Probabilistic Graphical Models, by Koller. Has an accompanying course on Coursera.

brent on June 5, 2008

I am not the 'parent', but as I said elsewhere I would recommend The Elements of Statistical Learning. It gets quite a bit deeper than PCI, but I'm fairly confident you could learn almost everything in PCI from ESL.

It doesn't come with pre-canned python, but honestly almost everything (in terms of code) in PCI is available somewhere on the web and/or already built into python libraries, matlab, and/or R.

lymitshn on Dec 31, 2019

AFAIK FastAI courses are well recommended for their Deep Learning stuff but they also have ML course[0]
Another usual recommendation is Elements of Statistical Learning book. Another option is finding a MOOC that you enjoy and following it.

[0]http://course18.fast.ai/ml

arbitrage314 on Nov 18, 2015

I'm a math geek, but I'm also a mostly self-taught data scientist.

"The Elements of Statistical Learning" (https://web.stanford.edu/~hastie/local.ftp/Springer/OLD/ESLI...) is far and away the best book I've seen.

It took me hundreds of hours to get through it, but if you're looking to understand things at a pretty deep level, I'd say it's well-worth it.

Even if you stop at chapter 3, you'll still know more than most people, and you'll have a great foundation.

Hope this helps!

melling on Dec 31, 2020

I’m up to Chapter 6 in ISLR

https://github.com/melling/ISLR

Would Elements of Statistical Learning be my next book?

I’ve seen the Bishop book highly recommended too, and it has been mentioned in this post.

https://www.amazon.com/Pattern-Recognition-Learning-Informat...

grayclhn on July 21, 2014

Two free books that I haven't seen mentioned, that are from more of a stats perspective

* James, Witten, Hastie, and Tibshirani's An Introduction to Statistical Learning, with Applications in R

http://www-bcf.usc.edu/~gareth/ISL/

* Hastie, Tibshirani, and Friedman's The Elements of Statistical Learning (more advanced)

http://statweb.stanford.edu/~tibs/ElemStatLearn/

ced on Mar 27, 2012

I agree that Russell and Norvig's AI doesn't have much penetration yet. As for Elements of Statistical Learning...
That's the canonical textbook for ML. If industry relies on splines, boosting, and support vector machines, then it is really not that far from modern academic ML research.

pallandt on Nov 10, 2013

Depending on how they structured their deal, it may very well be possible that Springer knows about this and is allowing it. There are quite a few other Springer books available on authors' websites, for example The Elements of Statistical Learning (http://www-stat.stanford.edu/~tibs/ElemStatLearn/)

curiousgal on Sep 22, 2016

I've actually been reading The Elements of Statistical Learning (The French version at least) and I've found it extremely interesting.

Thank you Tom!

markovbling on Oct 16, 2016

Machine learning is a subset of statistics.

The standard text in ML, "The Elements of Statistical Learning", is authored by statistics professors.

Statistics is the new statistics. The rest is marketing bullshit.

cs702 on Jan 12, 2016

The lecturers here, Hastie and Tibshirani, are also the authors of the classic textbook, "Introduction to Statistical Learning," probably the best introduction to machine/statistical learning I have ever read.[1]

I highly recommend the book and this online course, both of which are FREE.

Hastie and Tibshirani's other book, "The Elements of Statistical Learning," is also excellent but far more theoretical, and best for experienced practitioners who want to use it as a reference guide.[2]

--

[1] http://www-bcf.usc.edu/~gareth/ISL/ISLR%20Sixth%20Printing.p...
[2] http://statweb.stanford.edu/~tibs/ElemStatLearn/

ant_sz on May 18, 2014

I think the book ranked #2 in this article (The Elements of Statistical Learning) is too mathematical.

huac on July 21, 2015

If you're serious about the math, read "The Elements of Statistical Learning" instead. Same guys, just as much R code, but harder.

http://statweb.stanford.edu/~tibs/ElemStatLearn/

256 on Jan 17, 2018

I don't know much about AI, but for ML specifically, Elements of Statistical Learning is fantastic. I find its explanations a lot easier to understand than other resources. I recommend you skim through it to get a taste. Additionally, if you prefer lectures, ETHZ has recordings of their ML class[1].

The best way to learn the details is of course to read the original papers. This is especially true for following along with the latest developments in deep learning.

[1] https://www.ethz.ch/content/vp/en/lectures/d-infk/2017/autum...

kgwgk on July 6, 2018

I think Efron & Hastie may be a bit too advanced and terse for the OP. They cover many things in not so many pages and "our intention was to maintain a technical level of discussion appropriate to Masters’-level statisticians or first-year PhD students."

But given that they distribute the PDF for free it's worth checking out. Hastie, Tibshirani & Friedman's The Elements of Statistical Learning and the watered-down and more practical Introduction to Statistical Learning are also nice. All of them can be downloaded from https://web.stanford.edu/~hastie/pub.htm

perturbation on Oct 7, 2019

I'd recommend Elements of Statistical Learning or ISLR instead, if you want to start with a theory-heavy introduction. Most of what you need for DS, I think, you'd learn better through projects or on the job.

Also, as others have mentioned, some of the most important skills for DS are data munging, data "presentation", and soft skills like managing expectations / relationships / etc.

I would not recommend this book if you want to get into DS with the idea that, "I'll read this and then I'll know everything I need to." It's too dense and academically-focused, and it would probably be discouraging if you try to read this all without getting your feet wet.

xkgt on June 15, 2020

I just lost my morning skimming through this book. It looks like there are a lot of other good books among this collection.

It won't be possible to go through all of them, but are there any recommendations from HN? Personally, I have come across Elements of Statistical Learning[1], Recommender Systems[2], and The Algorithm Design Manual[3] in many recommended lists.

1. http://link.springer.com/openurl?genre=book&isbn=978-0-387-8...

2. http://link.springer.com/openurl?genre=book&isbn=978-3-319-2...

3. http://link.springer.com/openurl?genre=book&isbn=978-1-84800...

craigching on July 17, 2015

"Introduction to Statistical Learning" by Trevor Hastie et al. [1] They have a free online class through Stanford. [2] Sign in to their system and you can take the archived version for free.

ISL is an excellent, free book introducing you to ML. You can go deeper, but to me this is where I wish I'd started. I am taking the Data Science track at Coursera (on Practical Machine Learning now) and I am kicking myself that I didn't start with ISL instead.

Now, I know you specifically asked about Python, but the concepts are bigger than the implementation. All of these techniques are available in Python's ML stack, scikit-learn, NumPy, pandas, etc. I don't know of the equivalent of ISL for Python, but if you learn the concepts and you're a programmer of any worth, you will be able to move from R to Python. Maybe take/read ISL, but do the labs in Python, that might be a fun way to go.

Lastly, to go along with ISL, "Elements of Statistical Learning" also by Hastie et al is available for free to dive deeper [3]

[1] -- http://www-bcf.usc.edu/~gareth/ISL/

[2] -- https://lagunita.stanford.edu/courses/HumanitiesandScience/S...

[3] -- http://statweb.stanford.edu/~tibs/ElemStatLearn/

disgruntledphd2 on Mar 19, 2016

If you know almost nothing about the field, then Introduction to Statistical Learning is a good choice.

http://www-bcf.usc.edu/~gareth/ISL/ISLR%20First%20Printing.p...

It assumes some understanding of calculus, but doesn't require matrix algebra.

The original (and amazing) book that lots of people used is Elements of Statistical Learning.

https://web.stanford.edu/~hastie/local.ftp/Springer/OLD/ESLI...

Chapters 1-7 are worth their weight in gold. This is one of the cases where the physical books are much better, as you'll need to flick back and forth to see the figures (which are one of the best parts).

The foregoing assumes that you already know some statistics/data analysis (the latter probably being more important).

If you haven't done this before, then I suggest that you acquire some data you care about, install R (a good book is the Art of R Programming by Matloff), and start trying to make inferences. And draw graphs. Many, many, many graphs.

If you keep at this, finding papers/books and reading theory, and implementing it in your spare time, then you can probably get a good data science job in 1-2 years. You'll probably need to devote much of your free time to it though.

I'm assuming that you can already code, given the context :)

nilkn on June 22, 2017

Once you take calculus 3, I'd recommend diving right into the Deep Learning book: http://www.deeplearningbook.org/

It's definitely the best reference on the subject. With only calculus 3 under your belt the math won't be trivial, but it should overall be fairly approachable and certainly much more so than something like "The Elements of Statistical Learning".

psv1 on Dec 31, 2019

Good free resources:

- MIT: Big Picture of Calculus

- Harvard: Stats 110

- MIT: Matrix Methods in Data Analysis, Signal Processing, and Machine Learning

If any of these seem too difficult - Khan Academy Precalculus (they also have Linear Algebra and Calculus material).

This gives you a math foundation. Some books more specific to ML:

- Foundations of Data Science - Blum et al.

- Elements of Statistical Learning - Hastie et al. The simpler version of this book - Introduction to Statistical Learning - also has a free companion course on Stanford's website.

- Machine Learning: A Probabilistic Perspective - Murphy

That's a lot of material to cover. And at some point you should start experimenting and building things yourself, of course. If you're already familiar with Python, the Data Science Handbook (Jake Vanderplas) is a good guide through the ecosystem of libraries that you would commonly use.

Things I don't recommend - Fast.ai, Goodfellow's Deep Learning Book, Bishop's Pattern Recognition and ML book, Andrew Ng's ML course, Coursera, Udacity, Udemy, Kaggle.

weavie on Sep 9, 2015

The Elements of Statistical Learning together with the online course (http://www.r-bloggers.com/in-depth-introduction-to-machine-l...) makes for a great introduction.

EDIT: Oops I should have said "An Introduction to Statistical Learning with Applications in R" rather than The Elements of Statistical Learning. The Elements book goes into way too much depth to be a good introduction to the subject.

brent on May 24, 2008

I second videolectures as an excellent source. Just listen to the ones you are interested in! One tip I'd offer is to make sure you understand the math of each lecture before moving on. Skip over the maths enough and you'll find you haven't truly learned much.

However, in terms of books I would add Elements of Statistical Learning (Hastie, Tibshirani, and Friedman). It is an excellent text that covers a lot of ground. The downside of course is that it is written at the graduate level, so be prepared.

kuusisto on Sep 12, 2018

I got my start by getting a PhD, but that's perhaps not a practical recommendation. In reality though, you might say I started learning ML by reading Mitchell in class:
https://www.cs.cmu.edu/afs/cs.cmu.edu/user/mitchell/ftp/mlbo...
It's dated, but it's quite approachable and does a great job explaining a lot of the fundamentals.

If you want to approach machine learning from a more statistical perspective, you could also have a look at An Introduction to Statistical Learning to start:
http://www-bcf.usc.edu/~gareth/ISL/
Or if you're more mathematically inclined than the average bear, you could jump directly into The Elements of Statistical Learning:
https://web.stanford.edu/~hastie/ElemStatLearn/

If you want something a little more interactive than a book though, you might have a look at Google's free crash course on machine learning:
https://developers.google.com/machine-learning/crash-course/...
I checked it out briefly maybe six months ago, and it seemed pretty good. It seemed a bit focused on TensorFlow and some other tools, but that's okay.

indigentmartian on Jan 21, 2017

Andrew Ng's Coursera course simply titled "Machine Learning" is good - it addresses the mathematics of fundamental algorithms and concepts while giving practical examples and applications: https://www.coursera.org/learn/machine-learning

Regarding books, there are many very high quality textbooks available (legitimately) for free online:

Introduction to Statistical Learning (James et al., 2014) http://www-bcf.usc.edu/~gareth/ISL/

the above book shares some authors with the denser and more in-depth/advanced

The Elements of Statistical Learning (Hastie et al., 2009) http://statweb.stanford.edu/~tibs/ElemStatLearn/

Information Theory: Inference & Learning Algorithms (MacKay, 2003) http://www.inference.phy.cam.ac.uk/itila/p0.html

Bayesian Reasoning & Machine Learning (Barber, 2012) http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=...

Deep Learning (Goodfellow et al., 2016) http://www.deeplearningbook.org/

Reinforcement Learning: An Introduction (Sutton & Barto, 1998) http://webdocs.cs.ualberta.ca/~sutton/book/ebook/the-book.ht...

^^ the above books are used on many graduate courses in machine learning and are varied in their approach and readability, but go deep into the fundamentals and theory of machine learning. Most contain primers on the relevant maths, too, so you can either use these to brush up on what you already know or as a starting point look for more relevant maths materials.

If you want more practical books/courses, more machine-learning focussed data science books can be helpful. For trying out what you've learned, Kaggle is great for providing data sets and problems.

blahi on July 13, 2016

Regression Modeling Strategies. You need at least some notion of what regressions and probability are all about, but if you have the basics covered, this book will take you through 80% of the journey and the rest is some googling to figure out some concepts that might be murky.

This is a book that emphasizes practical applications without getting too bogged down in the math details. If on the other hand you are a math whiz, Elements of Statistical Learning is THE book, but it expects you to be very proficient in math.

Both books are seriously underrated, which is kind of funny to say because you will find only praises about them, but they deserve even more.

disgruntledphd2 on Dec 23, 2011

Foundations of Statistical NLP is awesome.

Having some background in statistics, but none in either linguistics or NLP, I found that book a revelation. If you read and implemented all the exercises in that book you'd find a way to make millions, as NLP is a big deal right now. I did find a little too much concentration on the low-level stuff (character parsing, bag of words, etc.), but in conjunction with Elements of Statistical Learning it's wonderful.

cs702 on June 29, 2018

If you have at least some coding experience and you are interested in the practical aspects of ML/DL (i.e., you want to learn the how-to, not the why or the whence), my recommendation is to start with the fast.ai courses by Jeremy Howard (co-author of this "Matrix Calculus" cheat sheet) and Rachel Thomas[a]:

* fast.ai ML course: http://forums.fast.ai/t/another-treat-early-access-to-intro-...

* fast.ai DL course: part 1: http://course.fast.ai/ part 2: http://course.fast.ai/part2.html

The fast.ai courses spend very little time on theory, and you can follow the videos at your own pace.

Books:

* The best books on ML (excluding DL), in my view, are "An Introduction to Statistical Learning" by James, Witten, Hastie and Tibshirani, and "The Elements of Statistical Learning" by Hastie, Tibshirani and Friedman. The Elements arguably belongs on every ML practitioner's bookshelf -- it's a fantastic reference manual.[b]

* The only book on DL that I'm aware of is "Deep Learning," by Goodfellow, Bengio and Courville. It's a good book, but I suggest holding off on reading it until you've had a chance to experiment with a range of deep learning models. Otherwise, you will get very little of use out of it.[c]

Good luck!

[a] Scroll down on this page for their bios: http://course.fast.ai/about.html

[b] Introduction to Statistical Learning: http://www-bcf.usc.edu/~gareth/ISL/ The Elements of Statistical Learning: https://web.stanford.edu/~hastie/ElemStatLearn/

[c] http://www.deeplearningbook.org/

maciejgryka on Dec 23, 2011

I'd recommend the following two (both free online):

The Elements of Statistical Learning:
http://www-stat.stanford.edu/~tibs/ElemStatLearn/

The second, while focused on computer vision, has a great intro to probability and learning:

Computer Vision: Models, Learning, and Inference
http://computervisionmodels.com/

texthompson on Sep 20, 2015

I'm a big fan of two books:

* The Elements of Statistical Learning, by Hastie, Tibshirani and Friedman (https://web.stanford.edu/~hastie/local.ftp/Springer/OLD/ESLI...).
* Probability Theory: The Logic of Science, by ET Jaynes (http://bayes.wustl.edu/etj/prob/book.pdf)

Best of luck. I can see from your post that you're thinking about performance tuning; I'm assuming you mean of software. That's a nice area - the nice part is that, compared to fields like medical genetics, data on the performance of software is relatively cheap to get, so a lot of issues about small sample sizes are surmountable.

eachro on Dec 31, 2019

I think it depends on what you want to focus on. If you want to do deep learning, fast.ai is probably the best resource available. Jeremy Howard and Rachel Thomas (the two founders) have poured quite a lot into fostering a positive, supportive community around fast.ai which really does add quite a lot of value.

If you want to really understand the fundamentals of machine learning (deep learning is just one subset of ML!), there is no substitute for picking up one of the classic texts, like Elements of Statistical Learning (https://web.stanford.edu/~hastie/ElemStatLearn/) or Machine Learning: A Probabilistic Perspective (https://www.cs.ubc.ca/~murphyk/MLbook/), and going through it slowly.

I'd recommend a two-pronged approach: dig into fast.ai while reading a chapter a week (or at whatever pace matches your schedule) of whatever ML textbook you end up choosing. Despite all of the hype around deep learning, you really can do some pretty sweet things (ex: classify images/text) with neural nets within a day or two of getting started. Machine learning is a broad field, and you'll find that you will never know as much as you think you should, and that's okay. The most important thing is to stick to a schedule and be consistent with your learning. Good luck on this journey :)

disgruntledphd on June 12, 2011

If you have already brushed up on linear algebra and calculus (which you're gonna need if you want to do any serious ML), take a look at Hastie et al.'s Elements of Statistical Learning.
The PDF is free, and the book is both extremely well written and super comprehensive.
http://www-stat.stanford.edu/~tibs/ElemStatLearn/

You might also want to check out R, as it's an amazing statistics language which has hundreds of packages available for ML. There's a large user community, and the really obscure error messages you get will teach you a lot about statistics. http://cran.r-project.org/

Also, a lot of machine learning is getting the data into a usable form, so learn how to use Unix command line tools such as sed, awk, grep et al. They are absolute lifesavers.

fnbr on Nov 4, 2017

If you're looking to understand the underlying theory behind deep learning, the Deep Learning book by Goodfellow et al. is awesome.

If you're interested in general machine learning, The Elements of Statistical Learning, by Tibshirani et al., is great; a more applied book by the same authors is An Introduction to Statistical Learning.
For a more applied view, I'd check out TensorFlow or PyTorch tutorials; there's no good book, as far as I'm aware, because the tech changes so quickly.

I've done a series of videos on how to do deep learning that might be useful; if you're interested, there's a link in my profile.

dcl on Apr 6, 2017

I recommend Elements of Statistical Learning, free @ https://statweb.stanford.edu/~tibs/ElemStatLearn/

This book covers everything from simple regression and classification on the statistical side to gradient boosted decision trees and the like on the ML side, with enough math to make sure you understand what's actually going on.

I should note, it doesn't touch deep learning, which is what I suspect most people interested in 'machine learning' without any background in stats want to learn about these days.
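(To make that coverage concrete, here is a short sketch of one of the ML-side methods mentioned, gradient boosted decision trees, via scikit-learn; the library and toy data are my choice for illustration, since ESL derives the algorithm itself rather than prescribing a tool.)

```python
# Gradient boosting fits a sequence of small decision trees, each one
# correcting the residual errors of the ensemble built so far.
from sklearn.ensemble import GradientBoostingClassifier

X = [[0.0], [1.0], [2.0], [8.0], [9.0], [10.0]]  # toy 1-D feature
y = [0, 0, 0, 1, 1, 1]                            # cleanly separable labels

clf = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)
labels = list(clf.predict([[1.5], [8.5]]))  # one point from each cluster
```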

phren0logy on Nov 21, 2009

If you are starting from scratch without a very strong math background, I'd recommend:

1.) Head First Statistics -- Pretty good, but beware the section on Bayes' Theorem, which is a bit off. This is a quick and casual intro, but worthwhile. I used it to refresh me on my college stats course (which was a long time ago), and I like it. There's also Head First Data Analysis, which I haven't read but could be a reasonable companion. HF Data Analysis uses Excel and R.

2.) Using R for Introductory Statistics (Verzani) -- Good explanations and exercises, and you will also learn R. This second point is actually pretty important, because it's a very valuable tool. Whereas the Head First Stats book walks through pretty simple problems that you work out in pencil, the Verzani book has many real-world data sets to explore that would be impractical to do by hand. That said, I think it's valuable to work things out in pencil with the first book before you move on to this one.

After these books, Elements of Statistical Learning seems to be the current favorite.

dkarl on Nov 3, 2010

Making Our Democracy Work, by Justice Stephen Breyer

The Koran (Just five minutes here and there. Honestly, I find it excruciatingly boring.)

The Elements of Statistical Learning (Just started.)

Plus fun stuff. I have a Simenon on my bedside table plus The Tenant of Wildfell Hall. Not sure which is next.

stonemetalonMay 8, 2015

Machine learning is applied statistics. Elements of Statistical Learning was written by profs at Stanford and is free on the internet. It might be a bit hard since they don't really teach stats in high school. O'Reilly has a few ML books. I liked Programming Collective Intelligence; it is beginner friendly and would probably help you come up with project ideas.

dafrdmanonAug 31, 2020

I agree, though I saw that as outside the scope of this book. I tried to be clear in the introduction that the book is a "user manual" of sorts that simply shows how to construct models, rather than how to decide between them, what the benefits of each are, etc. That information is certainly important, but I felt it had been covered more than adequately by books like Elements of Statistical Learning.

amrrsonAug 28, 2017

Statistics and Probability - For a non-math background, OpenIntro.org with its R and SAS labs is a good one. Khan Academy videos on the same topics make a lot of concepts easier.

http://www.r-bloggers.com/in-depth-introduction-to-machine-l...
Introduction to Statistical Learning http://www-bcf.usc.edu/~gareth/ISL/ (by Trevor Hastie and Rob Tibshirani, among others; free, I believe). For more depth, Elements of Statistical Learning by the same authors.

Linear Algebra (the linear algebra section of Andrew Ng's Introduction to Machine Learning course is short and crisp)

If you're not scared by derivatives, you can check them out. But you can easily survive, and even excel, as a data scientist or ML practitioner with these.

kitanataonMay 18, 2018

This right here is a perfect example of why this position is gatekept so much. Having a PhD doesn’t automatically mean you think like a scientist or a mathematician, and having a bachelor’s degree and 15 years of experience writing code and then putting in the effort to read Elements of Statistical Learning, completing several online courses in machine learning and data science, and studying probability, combinatorics, and graph theory does not mean you don’t have a scientific mindset. A scientific mindset can be learned. You are not special because you have a PhD and I don’t. The only difference between you and me is that I had the ability to learn for free what you paid for. But when you go “oh hah, he’s just a coder and only has a bachelor’s degree” and you won’t even call me to talk to me, that means you are missing out, and you are gatekeeping. You are not special and you are not smarter than me. You’ve just read a book I haven’t and written a white paper. I can read that book too. I can write a paper too.

I can do everything you can do.

vowellessonDec 28, 2019

* Elements of Statistical Learning - Hastie, Tibshirani

* (Lots of machine learning books to list: PRML, All of Stats, Deep Learning, etc.)

* Active Portfolio Management - Kahn, Grinold

* Thinking, fast and slow - Kahneman

* Protein Power (the Eades') / Why we get fat (Taubes)

* Why we sleep (Walker)

* Deep Work / So Good They Can't Ignore You (Newport)

* Flowers for Algernon (Keyes)

* Getting to Yes (Fisher)

alexcnwyonAug 20, 2017

I find it curious that machine learning is a branch of computer science given how much of it is actually statistical learning theory.

The go-to graduate level machine learning text "The Elements of Statistical Learning" was written by two statistics professors (one, Prof. Hastie, a fellow South African! :))

Granted, neural networks are often taught as a bolt-on lecture in the machine learning modules of statistics courses, but topics like regularization and the rigorous study of overfitting were born in stats departments.

jll29onJuly 24, 2020

You want to read either Elements of Statistical Learning (2nd ed.) OR Kevin Murphy's ML book for the theory.

Then you will want to consult a text book in your work domain (e.g. introduction to speech & language OR statistical natural language processing for the domain of natural language processing).

And finally, you will want either a book, or free online Web resources/tutorial videos that show you how to do things in practice, given a particular programming language and tool-set (e.g. Python + TensorFlow, Java + DeepLearing4J).

This recipe of Theory + Application + Practice/Tools should get you there.

EugeleoonJune 29, 2020

I, for one, enjoyed your little foray into the field of statistics. So much so, that I'd love to learn some of this stuff as well! I'm a bioinformatics student, so I have a rigorous math background (analysis, linear algebra), but for some reason, our course is quite light on statistics and probability.

What resource would you recommend to get an intuitive grasp of statistics?

To give you an idea about what kind of resource (book) I'm looking for: I'm currently reading Elements of Statistical Learning and I enjoy that it has all the mathematical rigour I need to really understand why all of it works, but also that it's heavy on commentary and pictures, which helps me to understand the math quicker. Counterexamples: Baby Rudin one one side of the spectrum, The Hundred-Page Machine Learning Book on the other.

pjmorrisonDec 31, 2019

There's a MOOC that uses 'Introduction to Statistical Learning' by the authors of 'Elements of Statistical Learning', here: https://lagunita.stanford.edu/courses/HumanitiesSciences/Sta...

mustafafonFeb 4, 2011

If you really want to learn the fundamental underpinnings of machine learning, you will need a strong background in probability and stochastic processes. I would suggest Python (or MATLAB if you can get access to it) to learn how different methods work. That way you can separate mathematical issues from programming issues. As far as courses go, you should be looking for courses in Linear Algebra, Numerical Computation/Optimization (Convex, Nonlinear), Statistical Inference, and Stochastic Processes.

Good References:
1) Elements of Statistical Learning - Hastie, Tibshirani and Friedman

2) Pattern Classification - Duda, Hart and Stork

3) Pattern Recognition - Theoridis, Koutroumbas

4) Machine Learning - Tom Mitchell

5) http://videolectures.net/Top/Computer_Science/Machine_Learni...

tom_bonSep 22, 2016

Hi,

I found Larry Harris' Trading and Exchanges: Market Microstructure for Practitioners a solid introduction to market making and trading. Terms and concepts are easy to pick up from the text. I was comfortable enough after reading it to skim stats journal papers talking about market making models. The Stockfighter team had mentioned it in older threads here. It's expensive, but I just borrowed it from the library at my university instead of buying.

I also like The Elements of Statistical Learning which is free from the authors (http://statweb.stanford.edu/~tibs/ElemStatLearn/download.htm...). Although it isn't specifically about economics or markets, you should at least read it.

I'm at a loss on general economics books.

jamiionNov 10, 2010

> How did you train your search engine?

I used some pretty simple techniques: one bayesian filter to filter jobs from other posts in mailing lists etc, one bayesian filter to score jobs based on keywords. Both filters were trained by feedback from the console ui. The main problem is excluding site-specific keywords that distort the scoring (eg if a site with mostly crappy jobs includes its own name in the listing then even the good jobs will score low by association). A lot of job sites have manky markup so I also had a different scraping script for each site to extract text. All in all it's only a couple of hours' work. I've been thinking recently about extending it and adding a simple web ui, since finding freelance work is pretty time consuming.
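A keyword-scoring filter like the one described fits in a few lines. This is a hypothetical minimal sketch (multinomial naive Bayes with add-one smoothing, made-up training snippets), not the actual script:

```python
import math
from collections import Counter

class KeywordScorer:
    """Tiny multinomial naive Bayes scorer for short job-post texts."""

    def __init__(self):
        self.counts = {"good": Counter(), "bad": Counter()}
        self.totals = {"good": 0, "bad": 0}

    def train(self, label, text):
        # Feedback from the UI would call this with "good" or "bad".
        for word in text.lower().split():
            self.counts[label][word] += 1
            self.totals[label] += 1

    def score(self, text):
        # Log-odds of "good" vs "bad"; add-one smoothing handles unseen words.
        vocab = set(self.counts["good"]) | set(self.counts["bad"])
        log_odds = 0.0
        for word in text.lower().split():
            p_good = (self.counts["good"][word] + 1) / (self.totals["good"] + len(vocab) + 1)
            p_bad = (self.counts["bad"][word] + 1) / (self.totals["bad"] + len(vocab) + 1)
            log_odds += math.log(p_good / p_bad)
        return log_odds

scorer = KeywordScorer()
scorer.train("good", "remote python contract well paid")
scorer.train("bad", "unpaid internship rockstar ninja")
print(scorer.score("python contract"))  # positive score: looks like a good listing
```

Per-site keyword distortion could then be handled by dropping the site's own name from the vocabulary before scoring.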

> What sort of collaborative filtering techniques?

I didn't have any specific in mind, but there are plenty of good machine learning books that cover different techniques. If you don't already have a background in maths then 'Programming Collective Intelligence' is a good book to start with. 'The Elements of Statistical Learning' goes into a lot more detail but requires some basic maths.

> Would love to chat further via email.

Email is in my profile.

quantoonSep 3, 2017

Great. Thanks for the recommendation.
What I mean by mathematical background is at or above undergraduate level (so definitely covers calculus, linear algebra, intermediate statistics). A background that can read Elements of Statistical Learning (ESL) comfortably.

What I found is that many "data science" books cover how to use R or Pandas at a very introductory level. Books like ESL focus on core theories (which is great) but do not focus on how to tackle a tough real-world data.

I suppose much of data insight come from experience, but I was wondering whether there are sources to help me jump start.

misframeronMay 8, 2015

An Introduction to Statistical Learning [0] is also good. It's a little less technical than The Elements of Statistical Learning. We used it for our statistical learning course at my university. The full PDF is available for free as well [1].

[0] http://www-bcf.usc.edu/~gareth/ISL/

[1] http://www-bcf.usc.edu/~gareth/ISL/ISLR%20Fourth%20Printing....

jlgrayonMay 15, 2016

If you have mastered the basics (e.g. Norvig's AIMA, Hastie and Tibshirani's Elements of Statistical Learning, Koller's PGM), then I would suggest that the only place to really get a view of the state of the art is by reading papers.

In general, scientific books are an overview of a field, which can only occur with sufficient time for hindsight and synthesis. Even a thousand page book such as Koller's PGM will be littered with references and suggestions of papers to read for a deeper understanding.

One partial exception might be the Deep Learning book by Goodfellow and Bengio, which was made public only a month or so ago. Even this, however, is just an overview. http://www.deeplearningbook.org/

mb7733onJune 15, 2020

I can second a recommendation of the Elements of Statistical Learning. It's well presented, self contained and features good exercises. It may also be slightly different in presentation from most books on ML because the authors are all professors of statistics, instead of CS. (But, it is NOT a classical statistics textbook.)

monk_the_dogonNov 5, 2011

I'm enrolled in the online Applied ML class from Stanford, and I've also been watching this course from CMU (I'm up to the Graphical Model 4 lecture - almost the midterm). If you've taken at least one stats class you'll get much more out of CMU's class.

BTW, here are some good online resources for machine learning:

* The Elements of Statistical Learning (free pdf book): http://www-stat.stanford.edu/~tibs/ElemStatLearn/

* Information Theory, Inference, and Learning Algorithms (free pdf book): http://www.inference.phy.cam.ac.uk/mackay/itila/

* Videos from Autumn School 2006: Machine Learning over Text and Images: http://videolectures.net/mlas06_pittsburgh/

* Bonus link. An Empirical Comparison of Supervised Learning Algorithms (pdf paper): http://www.cs.cornell.edu/~caruana/ctp/ct.papers/caruana.icm... (Note the top 3 are tree ensembles, then SVM, ANN, KNN. Yes, I know there is no 'best' classifier.)

alexanderchronMar 1, 2020

I can really recommend Introduction to statistical learning by James, Witten, Hastie and Tibshirani if you are looking for something that covers the theory without going into too much detail.

There is also Elements of Statistical Learning by the same authors if you are looking for something more rigorous. I haven’t read very much of it, but it is supposed to be very good too.

jamessbonApr 25, 2016

There are various cheat-sheets/decision trees to suggest which machine learning method e.g.

http://www.saedsayad.com/data_mining_map.htm

http://peekaboo-vision.blogspot.co.uk/2013/01/machine-learni...

https://azure.microsoft.com/en-us/documentation/articles/mac...

However, if you want to really understand how things fit together you're probably best reading one of the standard intro textbooks: Murphy's Machine Learning, Bishop's Pattern recognition and machine learning, Hastie et al's The Elements of Statistical Learning, or Wasserman's All of statistics.

Or Barber's textbook, which is freely available online and has some nice mind-maps/concept-maps/trees at the start of each section: http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=...

hrokronSep 12, 2020

I agree you'll definitely want to read Elements of Statistical Learning but there are a few more, namely Think Stats and Think Bayes.

Since no one has really said much about Bayes yet, I think it worth mentioning just how useful it is in DS and ML. A Bayesian approach makes a very good baseline and often one that is hard to beat.

If you're not particularly fluent with probability and statistics now, let me suggest you add in Khan Academy (make sure to pick the CLEP version) and JBstatistics. Khan has the advantage of quizzes (so you're not just kidding yourself that you know the material). JBstatistics has the advantage of really good explanations. You'll probably want to watch Khan at 1.5x speed.

laxativesonDec 24, 2015

No you're right, I read/skimmed it cover to cover, but that doesn't do that book justice. I didn't do a single exercise and I really focused on the sections that were relevant to what I was doing (Random Forests, testing/validation). The author recommends maybe 4 or 5 chapters as required reading in the foreword, so I read those much more thoroughly, but again, skipped all of the exercises and the appendices. At the end of the day though, I'm not going to be implementing these algorithms, I'm just using existing libraries, so I couldn't really justify the year+ of effort to go in depth on everything. That book is also a summarization of a lot of techniques. If you wanted more depth, I wouldn't think Elements of Statistical Learning is the book for it. I'm hoping to fill in some of those gaps by reading Optimization Models a bit more closely.

jochenleidneronAug 28, 2017

1. You can get a long way with high school calculus and probability theory.

2. Regarding books, I second the late David MacKay's "Information Theory, Inference and Learning Algorithms" and the second edition of "Elements of Statistical Learning" by Hastie et al. (there's also a more accessible version of a subset of the material, targeting MBA students, called An Introduction to Statistical Learning, by James et al.). Duda/Hart/Stork's Pattern Classification (2nd ed.) is also great.
The self-published volume by Abu-Mostafa/Magdon-Ismail/Lin, Learning from Data: A Short Course is impressive, short and useful for self-study.

3. Wikipedia is surprisingly good at providing help, and so is Stack Exchange, which has a statistics sub-forum, and of course there are many online MOOC courses on statistics/probability and more specialized ones on machine learning.

4. After that you will want to consult conference papers and online tutorials on particular models (k-means, Ward/HAC, HMM, SVM, perceptron, MLP, linear and logistic regression, kNN, multinomial naive Bayes, ...).

EvbnonNov 25, 2012

Read Andrew Ng's lecture notes (really more book-level quality) for the version of CS229 he wrote before he created the simplified/less-mathematical Coursera version. They are floating around online.

Also, Elements of Statistical Learning is available online for free (previous edition, maybe?) , which covers a lot more standard/traditional statistical curve/surface fitting topics as well, all with high mathematical rigor.

PandabobonMar 20, 2015

I've often seen "An introduction to statistical learning" [1] and "Elements of statistical learning" [2] cited as good resources for statistical inference and machine learning. The former is more of an undergrad text, while the latter seems to be aimed at graduate students. Both books are available free online.

[1]: http://www-bcf.usc.edu/~gareth/ISL/
[2]: http://statweb.stanford.edu/~tibs/ElemStatLearn/

mtzetonJuly 14, 2017

As a rookie trying to get into the field myself, I think there are quite a few ways to start about it.

The programming part with R, python, julia etc., seems to get the most attention here. I think the most important part here is to learn how to load datasets into your system of choice and work with them to get some nice plots out. The book "R for data science"[1] seems like a good intro for this with R and tidyverse.

Somewhat more overlooked here are the statistical models. I second the recommendation of "Introduction to Statistical Learning"[2], possibly supplemented with its big brother "Elements of Statistical Learning"[3] if you're more mathematically inclined and want more details. I like their emphasis on starting with simple models and working your way up. I also found their discussion of how to go from data to a mathematical model very lucid.

[1] http://r4ds.had.co.nz/

[2] http://www-bcf.usc.edu/~gareth/ISL/

[3] http://web.stanford.edu/~hastie/ElemStatLearn/

j7akeonJune 30, 2017

Elements of statistical learning book

spectramaxonApr 28, 2019

Geron's book is more of a tutorial/cookbook coalesced with important insights into the practice of machine learning. So, I recommend reading Introduction to Statistical Learning (and Elements of Statistical Learning for theoretical background) before jumping into Geron's book. As engineers, I agree we need to have some theoretical background, but at the same time, we are applying this knowledge to real world problems. Geron's book is invaluable, and I hope he publishes more; it is a gem.

esfandiaonApr 28, 2019

- Wasserman has a book called "All of statistics" that gives a lot of the background required to understand modern machine learning

- Hastie is a co-author of two machine learning books, one is "Elements of Statistical Learning" which is very comprehensive, and "Introduction to Statistical Learning", which is more approachable by people without too much background in stats.

tasubotadasonFeb 6, 2020

I've taken the MOOC version of this course and it was very poorly explained. I hope the content and lecturing has changed since 2013 because the content itself is really interesting.

I wasn't impressed with the quality of the book either. I did learn quite a few methods there (minhash) that I got to use later, so thanks for that, but compared to MLPR, Learning from Data, or TESL, this course's book pales in quality.

YadionOct 1, 2018

In machine learning, hands down these are some of the best related textbooks:

- [0] Pattern Recognition and Machine Learning (Information
Science and Statistics)

and also:

- [1] The Elements of Statistical Learning

- [2] Reinforcement Learning: An Introduction by Barto and Sutton

- [3] The Deep Learning by Aaron Courville, Ian Goodfellow, and Yoshua Bengio

- [4] Neural Network Methods for Natural Language Processing (Synthesis Lectures on Human Language Technologies) by Yoav Goldberg

Then some math tid-bits:

[5] Introduction to Linear Algebra by Strang

-----------
links:

- [0] [PDF](http://users.isr.ist.utl.pt/~wurmd/Livros/school/Bishop%20-%...)

- [0][AMZ](https://www.amazon.com/Pattern-Recognition-Learning-Informat...)

- [2] [amz](https://www.amazon.com/Reinforcement-Learning-Introduction-A...)

- [2] [pdf](http://incompleteideas.net/book/bookdraft2017nov5.pdf)

- [3] [amz](https://www.amazon.com/Deep-Learning-Adaptive-Computation-Ma...)

- [3] [site](https://www.deeplearningbook.org/)

- [4] [amz](https://www.amazon.com/Language-Processing-Synthesis-Lecture...)

- [5] [amz](https://www.amazon.com/Introduction-Linear-Algebra-Gilbert-S...)

rm999onSep 15, 2013

For a beginner to machine learning I'd recommend Andrew Ng's course notes and lectures over any textbook I've seen. But I prefer his Stanford CS 229 notes to Coursera for exactly the reasons you state: they are watered down. After you really can understand Andrew Ng's course notes I'd recommend a textbook because they go in more detail and cover more topics. My two favorites for general statistical machine learning are:

* Pattern Recognition and Machine Learning by Christopher M. Bishop

* The Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani and Jerome Friedman

Both are very intensive, perhaps to a fault. But they are good references and are good to at least skim through after you have baseline machine learning knowledge. At this stage you should be able to read almost any machine learning paper and actually understand it.

fractionalhareonSep 13, 2020

Lots of tech and finance companies (particularly those with standardized interview processes) will blacklist questions if they're found online. Those companies will constantly check GitHub, GeeksForGeeks and Leetcode to see if their questions are listed there with solutions.

This probably won't be the case for a question as basic as, "what is regression?" But for any intermediate to advanced interview question involving regression, I would expect companies to jealously guard it.

If you're earnestly interested in building and testing your knowledge, I would recommend you read The Elements of Statistical Learning and Data Analysis Using Regression and Multilevel/Hierarchical Models. Also a good upper undergrad textbook in probability, like A First Course in Probability.

joshvmonDec 31, 2019

Bear in mind Elements of Statistical Learning is a grad-level text. I would never recommend it to a beginner to the field over An Introduction to Statistical Learning, by the same authors.

Geron Aurelien's Oreilly book is great - Hands-On Machine Learning with Scikit-Learn and TensorFlow. Get the second edition which covers Tensorflow 2.

hashr8064onNov 22, 2018

I was just like you. Here's what I did.

1. Picked up a High School Algebra Book. Read from beginning to end and did all exercises.

2. Repeat #1 for Algebra 2, Statistics, Geometry and Calculus. (Really helpful for learning those topics fast was Khan Academy).

3. Did MIT Opencourseware's Calculus and Linear Algebra Courses w/the books and exercises.

Now, this took me about two years (maybe you can get it done quicker), but you're at a level where you can pretty much pick up any book (I picked up Elements of Statistical Learning) and actually start parsing and understanding what the formulas mean.

One thing I always do is tear apart formulas and equations and play with little parts of them to see how the parts interact with one another and behave under specific conditions, this has really helped my understanding of all kinds of concepts from Pearson's R to Softmax.
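That habit of tearing a formula apart is easy to act on in code. As an illustrative sketch (my own example, not the commenter's), here is softmax probed for two of its basic behaviors, shift invariance and sharpening under scaling:

```python
import math

def softmax(xs):
    # Subtracting the max is the standard numerical-stability trick;
    # it cancels in the ratio, so the output is unchanged.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Outputs always sum to 1 and preserve the ordering of the inputs.
base = softmax([1.0, 2.0, 3.0])
print(base)

# Adding a constant to every input changes nothing (shift invariance)...
print(softmax([101.0, 102.0, 103.0]))

# ...but scaling the inputs sharpens the distribution toward the max.
print(softmax([10.0, 20.0, 30.0]))
```

Playing with the inputs like this shows why softmax is often described as a "soft" argmax: scale up the inputs and the largest entry's probability approaches 1.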

darawkonJune 26, 2021

Attempting to self study statistics and econometrics in a serious way. I just finished Casella and Berger's "Statistical Inference". I'm about halfway through Shumway and Stoffer's "Time Series Analysis", but then next on the list, in the order I tend to read them:

- Theory of Point Estimation - Lehmann/Casella

- Bayesian Data Analysis 3rd ed.

- Convex Optimization - Boyd

- Econometrics - Hayashi

- New Introduction to Multiple Time Series Analysis

- Elements of Statistical Learning

- Machine Learning: A probabilistic perspective

There's a fair amount of overlap between these books, so it's not quite as much as it seems. But i'm hoping to make it through at least a chapter a week this year, which should get me most of the way through them. We'll see how it goes.

n00b101onOct 16, 2016

Machine learning is a subset of statistics.
The standard text in ML, "The Elements of Statistical Learning", is authored by statistics professors.

To be fair, I think "Machine Learning" was an academic marketing term coined by Computer Science departments. It seems that the term "Statistical Learning" was coined in response by Statistics departments. Other similar marketing buzzwords used by various factions (comp sci, stats, actuarial science, industrial engineers, etc) include "Data Science, "Predictive Analytics," "Data Mining," "Knowledge Discovery," "Knowledge Engineering," "Soft Computing," "Artificial Intelligence," "Big Data," "Deep Learning"... To be honest, it's tiresome and troubling to see academic departments invent and adopt overlapping and vacuous marketing buzzwords.

psb217onNov 25, 2012

Elements of Statistical Learning is Hastie's book. Between it and Bishop's book (i.e. Pattern Recognition and Machine Learning), I prefer Hastie's for clarity of exposition. In particular, I find that ESL better conveys the sort of intuitive understanding of _why_ a method works that facilitates practical applications and extensions to novel contexts. Though, there are topics for which the increased equations/explanations ratio of Bishop's book is useful.

I've TAed my university's graduate ML course for the past couple of years, so I've read most chapters of these books in some detail and have hands-on experience using them to help people who are looking closely at these topics for the first time. Interestingly, SVMs are actually a good example of when I'd suggest both books.

achompasonJuly 7, 2018

Elements of Statistical Learning is the other text I came in here to recommend.

One of my most valuable activities in grad school was printing and studying each chapter of EoSL.

It's a comprehensive text on the fundamentals of statistics and machine learning, a solid foundation for the cutting-edge techniques relying on deep learning and reinforcement learning.

kblarsen4onFeb 13, 2015

It is also worth mentioning that Bayesian classifiers, like Naive Bayes, are different from the type of Bayesian regression models described in this post.

Naive Bayes, for example, is more of a "machine learning" technique where the goal is to classify people into groups based on features. Naive Bayes is called Naive because it assumes that all regressors (x_j) are independent given the target variable (let's call it y and assume it is binary). In other words, the conditional log odds of y=1 given the x_j variables is equal to the sum of the log density ratios, where the log density ratio for variable x_j is ln(f(x_j|y=1)/f(x_j|y=0)).
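That identity (conditional log odds as a sum of per-feature log density ratios) is easy to verify numerically. A small sketch with two hypothetical Gaussian features and equal priors, parameters made up purely for illustration:

```python
import math

# (mean, sd) of each feature given y=1 and y=0; conditionally
# independent given y, which is the naive Bayes assumption.
params = {
    1: [(0.0, 1.0), (2.0, 1.0)],
    0: [(1.0, 1.0), (0.0, 1.0)],
}
prior = {1: 0.5, 0: 0.5}

def gauss_pdf(x, mu, sd):
    return math.exp(-((x - mu) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def log_odds(x):
    # ln P(y=1|x)/P(y=0|x) = ln(prior ratio) + sum_j ln f(x_j|y=1)/f(x_j|y=0);
    # with equal priors the first term vanishes.
    total = math.log(prior[1] / prior[0])
    for j, xj in enumerate(x):
        f1 = gauss_pdf(xj, *params[1][j])
        f0 = gauss_pdf(xj, *params[0][j])
        total += math.log(f1 / f0)
    return total

print(log_odds([0.0, 2.0]))  # features near the y=1 means: positive log odds
```

Each feature contributes its log density ratio independently, which is exactly why a single distorted feature (like a site name) can drag the whole score.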

On the other hand, in the price elasticity example described in the post, we want to infuse outside knowledge into the model because we don't believe what it says on its own. This is a situation where interpretation and believability are an important part of the objective function, because we will be running future pricing scenarios from the model.

If you are building, say, a churn model to predict who is going to cancel their accounts, you probably wouldn't infuse your model with outside knowledge since cross validation accuracy is your main goal. You might regularize your model, however, which can be done in a number of ways (Bayesian or non-Bayesian). But in a pricing model or media mix model, and many other cases, the use case above is very real.

I suggest reading the “Elements of Statistical Learning” by Hastie, Tibshirani, et al.

stochastic_monkonApr 5, 2018

Essentially, he asked if the above poster had read Elements of Statistical Learning, Murphy's ML textbook, Bishop's PRML, Reinforcement Learning: An Introduction, and Ian Goodfellow's Deep Learning textbook.

I simply clarified that the question was about computational learning theory, a subfield largely started by Leslie Valiant in the form of PAC (Probably Approximately Correct) learning. The difference in emphasis between the machine learning conferences I mentioned helps point out how practical machine learning (like ICML, matching PRML/ML/ESL) and feature extraction/representation learning (like ICLR, perhaps matching portions of both ICML and ICLR), while important, are not what the previous poster was asking about.

martincmartinonSep 9, 2013

For those who want a more solid take on machine learning, and who still remember their math and probability/statistics, (i.e. advanced undergrad or new grad student), the best texts seem to be:

The Elements of Statistical Learning by Hastie, Tibshirani and Friedman, available for free online.

Pattern Recognition and Machine Learning by Chris Bishop. Very Bayesian.

Machine Learning: A Probabilistic Perspective by Kevin Murphy. Also Bayesian, although not as Bayesian as Bishop. The most recent of the three, and therefore covers a few topics not covered elsewhere, like deep learning and conditional random fields. The first few printings are full of errors and confusing passages; it should be better before too long.

Did I miss any?

hamiltononOct 12, 2009

Lately I've been staring at the Codex Seriphinianus quite a bit. Worth finding a copy if you haven't seen it before.

From a technical point of view, The Elements of Statistical Learning, by Tibshirani, Friedman, and Hastie. Far and away the most illuminating demonstration that so many ML / AI techniques have a long-standing statistical foundation, and, essentially, everything boils down to the linear model.

laichzeit0onAug 26, 2018

They wrote the books Elements of Statistical Learning and Introduction to Statistical Learning in R. Those books are about least squares regression, clustering, decision trees, random forests, boosting, additive models, support vector machines, etc.

All these are common statistical learning methods used in Data Science.

tomrodonJune 19, 2020

Elements of Statistical Learning is a common entry point. I'd argue they are much closer than appears.

As a subfield in computer science, of course the concern has often been on algorithmic complexity and similar. But that is nascency exposed, in my view, and likely not representative of a fully mature field.

Armchair thought (not a historian of economics): I think Econometrics followed (and continues to follow) a similar evolution -- start with the goals (identification of model parameters, identifiability, KPIs), improve statistical validity and relevance, annotate dead ends or less common routes, continue on trucking.

mrileyonNov 14, 2007

I'll second this recommendation - I bought the printed copy, and I'm constantly going to it for reference. The fact that it's available for free is just an added bonus.

I would also suggest Elements of Statistical Learning: http://www-stat.stanford.edu/~tibs/ElemStatLearn/

As well as Duda, Hart, and Stork's Pattern Classification: http://rii.ricoh.com/~stork/DHS.html

pddproonMay 16, 2016

How does this compare to, say, "Introduction to Statistical Learning" and "Elements of Statistical Learning" by Hastie et al.? As I understand, the former is also supposed to be a concise introduction to statistical concepts, while the latter offers a more rigorous treatment. Where does this book fall in between?

pskomorochonJan 15, 2010

Nice lists, I often recommend these for people who want an introduction to the field:

"Mathematical Statistics and Data Analysis" by John A. Rice

"All of Statistics: A Concise Course in Statistics" by Larry Wasserman

"Pattern Recognition and Machine Learning" by Christopher M. Bishop

"The Elements of Statistical Learning" by T. Hastie et al http://www-stat.stanford.edu/~tibs/ElemStatLearn/

"Information Theory, Inference, and Learning Algorithms", David McKay http://www.inference.phy.cam.ac.uk/itprnn/book.html

"Introduction to Information Retrieval" - Manning et al. http://nlp.stanford.edu/IR-book/information-retrieval-book.h...

"The Algorithm Design Manual, 2nd Edition" - Steven Skiena http://www.algorist.com/

nilknonApr 28, 2019

For what it's worth, I disagree quite strongly with that review. The book is aimed at those with a pretty mature appetite for abstract mathematical reasoning, but not much specific knowledge in the areas of statistics, machine learning, and neural networks. It's an actual graduate-level book, and one must approach it with the appropriate background and education.

The Goodfellow book is not complete as an academic intro, but no one book can be. It's not very useful as a practical tutorial, but no book seeking this could cover the mathematical arguments that Goodfellow's book does. I found Goodfellow's book extremely useful for consolidating a lot of handwaving that I'd seen elsewhere and putting it in a slightly more rigorous framework that I could make sense of and immediately work with as a (former) mathematician.

Goodfellow's treatment is especially useful for mathematicians and mathematically-trained practitioners who nevertheless lack a background in advanced statistics. The Elements of Statistical Learning, for instance, is extremely heavy on statistics-specific jargon, and I personally found it far more difficult to extract useful insights from that book than I did from Goodfellow's.

jmountonNov 21, 2009

Unfortunately my favorites are a bit on the mathy side, so you may want to wait for other commenters with better advice. But there are two books I really feel should not be missed. For statistics: "Statistics, Third Edition" by David Freedman, Robert Pisani, and Roger Purves. For machine learning: "The Elements of Statistical Learning" by Trevor Hastie, Robert Tibshirani, and Jerome Friedman.

tom_bonJan 25, 2014

Yes, both 'The Elements of Statistical Learning' and 'An Introduction to Statistical Learning with Applications in R' are available free in pdf.

For fans of hard copy, I recently found that if your local (university?) library is a SpringerLink customer, you can purchase a print-on-demand copy of either book for $26.99, which includes shipping. Interior pages are in black and white (including the graphs), but that is a really cheap price for these two.

Andrew Ng's course notes from his physical class at Stanford (CS 229 - Machine Learning) are extensive and available as well at:

http://cs229.stanford.edu/materials.html

ryankupynonAug 4, 2020

I'm a big fan of ISL - one of the best intro machine-learning-oriented textbooks out there, IMO. If you're looking for a book that still offers a broad survey while going a bit deeper into the math, I recommend Elements of Statistical Learning as well (they share two authors):

https://web.stanford.edu/~hastie/Papers/ESLII.pdf

earlonFeb 12, 2009

Here are a set of links:

http://www.vetta.org/recommended-reading/

I'd second the recommendation of Bishop if you can hack the math, and also Elements of Statistical Learning, though I wouldn't attempt to learn techniques from the latter so much as look at a very interesting mathematical take on them.

gl

bravuraonFeb 12, 2009

I second the nomination of Bishop. It is the standard text. It is only two years old, and Bishop will teach you machine learning the way that the field practices it nowadays. In my lab of fourteen people, we must have six or so copies of Bishop.

I don't understand what is impractical about Bishop. If you are looking blindly to use an off-the-shelf machine learning implementation, that's one thing. Machine Learning has been described as the study of bias. If you want to understand when to pick certain techniques, and develop appropriate biases, then read Bishop.

"The Elements of Statistical Learning" by Hastie, Tibshirani and Friedman gives more of a statistician's approach. The treatment is simply less broad, and also more dated.

You can also look at Andrew Ng's video lectures: http://www.youtube.com/watch?v=UzxYlbK2c7E
He is very well-respected in the field. For certain students, watching a lecture may be preferable to reading a book.

_deliriumonJuly 24, 2010

Hmm, it depends on what exactly you mean. I was going to answer "often", but from the description it sounds like you have in mind the kinds of books that come from technical publishers like O'Reilly, which I read closer to "rarely/never".

For languages, frameworks, tools, etc., I almost never read books. But I do read technical books on concepts, ideas, research areas, techniques, etc. So e.g. in computational statistics / ML, I've never bought/read a book like "R in Action", but I do own "The Elements of Statistical Learning".

me2i81onJune 4, 2008

I thought the lack of references was a major failing of the book. Some of the algorithms barely scratch the surface, and it does a disservice to the reader to provide no place to go. Here are a few to get started: 1. Russell and Norvig's AI text, 2. Elements of Statistical Learning by Hastie et al., 3. Pattern Recognition and Machine Learning by Chris Bishop.

On the other hand going right into code examples is useful, including jumping right into getting real data downloaded and worked on.

DrbbleonMar 27, 2012

Parent is talking about Agent systems and just about everything that isn't regular statistics. Basically, in the past 20 years, computers got 1000 times smarter and people didn't, so old statistical models became tractable to apply to terabytes of data, and the schools of "invent a thinking algorithm" stopped being relevant.

/slightly bitter former "academic AI" student.

It's not really Academic vs Industry, though. It is Agents and Logic vs Statistics.

The standard text is Elements of Statistical Learning. It is grad-level and mostly theory. For goofing around in Python, try Programming Collective Intelligence.
