An Introduction to Statistical Learning: with Applications in R (Springer Texts in Statistics)
Gareth James, Daniela Witten, et al.
4.8 on Amazon
72 HN comments
Mastering Regular Expressions
Jeffrey E. F. Friedl
4.6 on Amazon
72 HN comments
Game Programming Patterns
Robert Nystrom
4.8 on Amazon
68 HN comments
Steve Jobs
Walter Isaacson, Dylan Baker, et al.
4.6 on Amazon
67 HN comments
Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series)
Kevin P. Murphy
4.3 on Amazon
66 HN comments
The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage
Cliff Stoll, Will Damron, et al.
4.7 on Amazon
61 HN comments
Programming: Principles and Practice Using C++ (2nd Edition)
Bjarne Stroustrup
4.5 on Amazon
58 HN comments
Ghost in the Wires: My Adventures as the World’s Most Wanted Hacker
Kevin Mitnick, William L. Simon, et al.
4.6 on Amazon
55 HN comments
Modern Operating Systems
Andrew Tanenbaum and Herbert Bos
4.3 on Amazon
54 HN comments
Head First Design Patterns: Building Extensible and Maintainable Object-Oriented Software 2nd Edition
Eric Freeman and Elisabeth Robson
4.7 on Amazon
52 HN comments
The Singularity Is Near: When Humans Transcend Biology
Ray Kurzweil, George Wilson, et al.
4.4 on Amazon
51 HN comments
The Everything Store: Jeff Bezos and the Age of Amazon
Brad Stone, Pete Larkin, et al.
4.6 on Amazon
51 HN comments
Compilers: Principles, Techniques, and Tools
Alfred Aho, Monica Lam, et al.
4.1 on Amazon
50 HN comments
Test Driven Development: By Example
Kent Beck
4.4 on Amazon
45 HN comments
Patterns of Enterprise Application Architecture
Martin Fowler
4.5 on Amazon
43 HN comments
criddell on Dec 4, 2018
Also, I'd love to hear about your favorite online courses. I took Andrew Ng's Machine Learning and Dan Boneh's Cryptography (both on Coursera) and they were excellent.
KerrickStaley on Apr 3, 2018
pietroppeter on July 3, 2020
mlwhiz on Sep 3, 2019
judk on Mar 8, 2014
Andrew Ng's Machine Learning lecture notes are world famous.
colund on Mar 15, 2016
dimatura on July 20, 2014
notimetorelax on Oct 3, 2012
Oh well, to be fair I would donate quite a lot for each course that I enjoyed.
smoyer on Apr 3, 2018
manca on Nov 29, 2020
rivaldo on Oct 10, 2016
elibryan on May 22, 2010
p1esk on June 6, 2016
fjellfras on June 11, 2012
theuncommon on Apr 4, 2018
tomaskazemekas on June 28, 2014
dbecker on Jan 31, 2014
denzil_correa on Mar 12, 2013
http://nltk.org/book/ch06.html
ThomasCharles on Aug 16, 2011
Machine Learning - Professor Andrew Ng - http://www.ml-class.org
Introduction to Databases - Professor Jennifer Widom - http://www.db-class.org
Introduction to Artificial Intelligence - Professor Sebastian Thrun and Dr. Peter Norvig - http://www.ai-class.com
knn on Mar 16, 2016
tonyedgecombe on Jan 16, 2017
The maths is fairly straightforward and the concepts are explained well.
Yadi on June 10, 2015
So this book must be awesome; I have also been looking around to find more model-based ML material to read.
I guess this is a part of the 2013 Microsoft Research[1] paper
---------------
[1] http://research.microsoft.com/en-us/um/people/cmbishop/downl...
salusinarduis on Apr 9, 2015
* Machine Learning - Peter Flach
* Guns, Germs, and Steel - Jared Diamond
* Dune ( for the third time, I love it :) ) - Frank Herbert
I usually read some of pg's, sama's, avc's, or some other prominent essayist's work in the bath each week.
flor1s on Mar 18, 2017
This alongside Andrew Ng's Machine Learning course was my first exposure to the field. https://www.coursera.org/learn/machine-learning
I can also recommend Sebastian Thrun's Artificial Intelligence for Robotics course: https://www.udacity.com/course/artificial-intelligence-for-r...
CodeGlitch on Apr 6, 2020
It's using scikit-learn, which I've been meaning to pick up for a while. Other than that, I'm learning how to use the ELK stack (Elasticsearch, Logstash, Kibana) for some network-traffic analysis stuff I've been meaning to implement.
I think we'll all come out of this period a lot smarter :)
[1] https://pluralsight.com
rrampage on Apr 22, 2020
1. Introduction to the Theory of Computation - Sipser and Introduction to Algorithms - CLRS (for the Algorithms course)
2. Machine Learning - Tom Mitchell
3. Artificial Intelligence - A Modern Approach - Russell and Norvig
4. Design of Everyday Things - Norman (for HCI)
5. Operating Systems: Three Easy Pieces [0]
Most of the courses had their own notes, slides and suggested research papers as primary reading and the textbooks were mostly used as a secondary reference.
[0] - http://pages.cs.wisc.edu/~remzi/OSTEP/
kjain on Mar 11, 2015
It is one of the best places to start. Note that the course will require you to spend considerable time. If you find it challenging, you can also look at the Machine Learning class by Andrew Ng on Coursera.
Once you have completed these courses, you can take the Coursera course on Neural Nets or look at the tutorials on deeplearning.net
jmzachary on Nov 13, 2007
Programming Collective Intelligence by Toby Segaran for a practical approach in Python.
pmayrgundter on Dec 20, 2020
https://github.com/pablo-mayrgundter/freality/tree/master/ml...
allenleein on Oct 15, 2016
1. Machine Learning by Andrew Ng (https://www.coursera.org/learn/machine-learning) /// Class notes: (http://holehouse.org/mlclass/index.html)
2. Yaser Abu-Mostafa’s Machine Learning course which focuses much more on theory than the Coursera class but it is still relevant for beginners.(https://work.caltech.edu/telecourse.html)
3. Neural Networks and Deep Learning (Recommended by Google Brain Team) (http://neuralnetworksanddeeplearning.com/)
4. Probabilistic Graphical Models (https://www.coursera.org/learn/probabilistic-graphical-model...)
5. Computational Neuroscience (https://www.coursera.org/learn/computational-neuroscience)
6. Statistical Machine Learning (http://www.stat.cmu.edu/~larry/=sml/)
If you want to learn AI:
https://medium.com/open-intelligence/recommended-resources-f...
jahan on Sep 3, 2015
indigentmartian on Jan 21, 2017
Regarding books, there are many very high quality textbooks available (legitimately) for free online:
Introduction to Statistical Learning (James et al., 2014) http://www-bcf.usc.edu/~gareth/ISL/
the above book shares some authors with the denser and more in-depth/advanced
The Elements of Statistical Learning (Hastie et al., 2009) http://statweb.stanford.edu/~tibs/ElemStatLearn/
Information Theory: Inference & Learning Algorithms (MacKay, 2003) http://www.inference.phy.cam.ac.uk/itila/p0.html
Bayesian Reasoning & Machine Learning (Barber, 2012) http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=...
Deep Learning (Goodfellow et al., 2016) http://www.deeplearningbook.org/
Reinforcement Learning: An Introduction (Sutton & Barto, 1998) http://webdocs.cs.ualberta.ca/~sutton/book/ebook/the-book.ht...
^^ the above books are used on many graduate courses in machine learning and are varied in their approach and readability, but go deep into the fundamentals and theory of machine learning. Most contain primers on the relevant maths, too, so you can either use these to brush up on what you already know or as a starting point look for more relevant maths materials.
If you want more practical books/courses, more machine-learning-focused data science books can be helpful. For trying out what you've learned, Kaggle is great for providing data sets and problems.
sampo on Oct 20, 2012
So do you think these lectures are not enough to bring one up to speed in applying these concepts in linear regression?
Of course, a formally educated person has taken a full semester of Linear Algebra and solved dozens of homework problems of the "transpose this", "invert that" kind, so it's difficult to guess how much homework of the boring sort would be needed before one is able to apply these concepts in problem solving.
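The amount of linear algebra behind ordinary least squares really is modest. As a rough sketch (my own, not from any of the courses discussed here), the whole "transpose this, invert that" exercise for simple linear regression — theta = (X^T X)^(-1) X^T y — fits in a few lines of plain Python once the 2x2 inverse is written out by hand:

```python
# Least-squares fit via the normal equations: theta = (X^T X)^{-1} X^T y.
# Pure-Python sketch for one feature plus an intercept, so X^T X is 2x2
# and can be inverted by hand with its determinant.

def fit_linear_regression(xs, ys):
    n = len(xs)
    sx = sum(xs)                              # sum of x
    sxx = sum(x * x for x in xs)              # sum of x^2
    sy = sum(ys)                              # sum of y
    sxy = sum(x * y for x, y in zip(xs, ys))  # sum of x*y
    # With design-matrix rows [1, x]:  X^T X = [[n, sx], [sx, sxx]],
    # X^T y = [sy, sxy]. Invert the 2x2 matrix via its determinant.
    det = n * sxx - sx * sx
    intercept = (sxx * sy - sx * sxy) / det
    slope = (n * sxy - sx * sy) / det
    return intercept, slope

# Data generated from y = 2x + 1 recovers those coefficients exactly.
b, w = fit_linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
```

The same pattern scales to many features; the only change is that the 2x2 hand inverse becomes a general linear solve.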
jdj on Apr 6, 2011
arohner on Oct 14, 2009
MacKay's looks like a good book, but appears more applicable to pure information theory than to ML specifically.
rohitarondekar on Mar 30, 2015
[1]: Roughgarden's Algorithms part I, Programming Languages by Dan Grossman and Machine Learning by Andrew Ng.
mindcrime on Feb 22, 2016
In addition, if you don't already have a background in Calculus and Linear Algebra, then supplement the Ng course with the Khan Academy stuff on Calculus and Linear Algebra, or other courses you can pick up on Coursera or Edx or whatever.
If you get really interested in neural networks (which are all the rage these days) after the Ng class, there's a freely available book on Neural Network design that you could look at. It doesn't cover all the very latest techniques, but it would help you build the foundation of understanding.
http://hagan.okstate.edu/nnd.html
There's also a MOOC around the Learning From Data book that you could check out.
http://amlbook.com/
https://work.caltech.edu/telecourse.html
OTOH, if you're making a sharp distinction between "Classic AI" and "Machine Learning" and you really care mainly about the classical stuff, then you might want to start with the Berkeley CS188 class. You can take it through EdX (https://www.edx.org/course/artificial-intelligence-uc-berkel...) or just watch the videos and download the notes and stuff from http://ai.berkeley.edu/home.html
And if you just want to dive into reading some classic papers and stuff, check out:
http://publications.csail.mit.edu/ai/pubs_browse.shtml
and/or
http://ijcai.org/past_proceedings
Another good resource is
http://aitopics.org/
ramblenode on Aug 28, 2016
1. "Machine Learning" - Murphy
It's a classic and a rigorous introduction to the subject. It focuses on theory over implementation so it's more for developing a principled understanding of machine learning fundamentals than for getting you up to speed with modern tools necessary for solving real problems.
2. "Neural Networks and Deep Learning" - Nielsen
This is a free online book that is very accessible and engaging. It covers basic theory and implements examples in Python.
mustafaf on Feb 4, 2011
Good References:
1) Elements of Statistical Learning - Hastie, Tibshirani and Friedman
2) Pattern Classification - Duda, Hart and Stork
3) Pattern Recognition - Theodoridis, Koutroumbas
4) Machine Learning - Tom Mitchell
5) http://videolectures.net/Top/Computer_Science/Machine_Learni...
saravana85 on Oct 12, 2020
guessmyname on Aug 28, 2016
sampo on Dec 12, 2013
https://class2go.stanford.edu/db/Winter2013/preview/
I have taken some MOOCs in past years, including Andrew Ng's Machine Learning, and Martin Odersky's Functional Programming Principles in Scala, and I have to say that Jennifer Widom's database class is the most streamlined, most well thought, and most smooth of all the MOOCs I have seen.
Everything works, the lectures are packed with information and give sufficient material for one to complete the exercises, there are no rough edges, and there is a large-ish amount of exercises that give good coverage of the lectured material.
e-dard on Oct 25, 2012
Also, in terms of applying ML, you could scoot through Andrew Ng's Machine Learning Coursera course (not sure if it's running at the moment, though).
Finally... One tip – I have always found that when you want to start applying ML to real-world problems, start simple and only iterate when the results of your approach are not satisfactory. This is usually because all the bleeding-edge ML research/techniques don't consider a shit-load of real-world issues, like scalability, applicability to a wide range of problems, unstructured or noisy data, and so on.
YeGoblynQueenne on Nov 7, 2020
Edit: I'm asking because your comment shows a confusion that is all too common on the internets today: thinking of "GOFAI" (symbolic AI) and "ML" (machine learning) as two somehow incompatible and perhaps mutually exclusive sub-fields of AI. This is as far from the truth as it could be. For instance, machine learning really took off as a subject of research in the 1980s with the work of Ryszard Michalski and others, who considered machine learning approaches as a way of overcoming the "knowledge acquisition bottleneck", i.e. the difficulty of hand-crafting production rules for expert systems. Indeed, most of the early machine learning systems were propositional-logic based, i.e. symbolic. And of course, one of the most widely used and well-known types of machine learning approach to this day, decision trees, hails from that time and also learns propositional-logic, symbolic models.
Of course, most people today know "ML" as a byword for deep learning, or at best statistical pattern recognition (if that). It's just another aberration brought on by the sudden explosion of interest in a once small field.
I refer you to Michalski's textbook, "Machine Learning: An Artificial Intelligence Approach" and Tom Mitchell's "Machine Learning" for more information on the early days of the field, i.e. until ca. 2000.
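To make that point concrete, here is a toy one-split ("decision stump") learner — my own illustration, with made-up weather data, not an example from Michalski or Mitchell. The model it induces is a set of readable propositional if-then rules, exactly the kind of symbolic output decision-tree learners produce:

```python
from collections import Counter, defaultdict

# Toy decision-stump learner: pick the single attribute whose value-test
# best separates the labels, and return the induced if-then rules.

def learn_stump(examples, attributes):
    # examples: list of (dict attribute -> value, label) pairs
    def error_of(attr):
        by_value = defaultdict(list)
        for feats, label in examples:
            by_value[feats[attr]].append(label)
        mistakes, rules = 0, {}
        for value, labels in by_value.items():
            majority, count = Counter(labels).most_common(1)[0]
            rules[value] = majority          # IF attr = value THEN majority
            mistakes += len(labels) - count  # examples the rule misclassifies
        return mistakes, rules
    best = min(attributes, key=lambda a: error_of(a)[0])
    return best, error_of(best)[1]

# Hypothetical training data: "outlook" perfectly predicts the label,
# "windy" does not, so the learner should split on "outlook".
examples = [
    ({"outlook": "sunny", "windy": "no"}, "play"),
    ({"outlook": "sunny", "windy": "yes"}, "play"),
    ({"outlook": "rain", "windy": "yes"}, "stay"),
    ({"outlook": "rain", "windy": "no"}, "stay"),
]
attr, rules = learn_stump(examples, ["outlook", "windy"])
# The learned model reads as propositional rules, e.g.
#   IF outlook = sunny THEN play
#   IF outlook = rain  THEN stay
```

A full ID3/C4.5-style tree just applies this split selection recursively, but even the one-level version shows that what is learned is a symbolic model, not a vector of opaque weights.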
meritt on Mar 29, 2017
https://www.coursera.org/learn/machine-learning
wirrbel on Feb 26, 2018
Have you looked at the monograph at all? It could have been written 5 to 10 years ago; 15 years ago, maybe a few sections would have looked a little different. In fact, I like to recommend Mitchell's Machine Learning book (I think it was written in the 90s) as an introduction for people with a serious interest.
There is currently a lot of hype around machine learning algorithms, because we are seeing good progress in things like computer vision / pattern recognition. This kicks off a marketing machinery that really blurs reality.
In reality, we have an established body of methods and modelling techniques that are sufficient, because the available data is the bottleneck for prediction quality. The actual challenge is to come up with a valuable business proposition, not necessarily to build the predictive model.
martingoodson on June 3, 2013
Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series) by Kevin Murphy
zumda on July 17, 2012
The lessons are interesting, and I really like their concept of quizzes in the middle of the lessons. They just show the question on the video and stop it, you choose which answer is right or type in the number, and then he explains it.
The homework is also well done and enhances the knowledge from the course. I had a lot of fun doing the lectures each and every week. You probably want to put aside a set day and time when you will do it, though, or else it is easy to do other things and "do the lectures later".
If they continue like this, there will soon be no reason to go to a university anymore. :) (which is a catch-22, since the lecturers are paid by universities...)
allthing on Apr 21, 2019
From section 4.6.2 of Tom Mitchell's Machine Learning book:
"Arbitrary functions. Any function can be approximated to arbitrary accuracy by a network with three layers of units (Cybenko 1988)."
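The quoted result can be illustrated with a toy construction (mine, not Mitchell's or Cybenko's): a single hidden layer of threshold units realizes a piecewise-constant approximation of the target function, and finer partitions drive the error down:

```python
import math

# Sketch of universal approximation with one hidden layer of hard-threshold
# units. Each unit "turns on" at a bin edge; its outgoing weight adds the
# change in the target function at that edge, so the network output is a
# piecewise-constant (staircase) approximation of f.

def step(z):
    return 1.0 if z >= 0 else 0.0

def make_network(f, a, b, n_units):
    width = (b - a) / n_units
    edges = [a + i * width for i in range(n_units)]
    weights, prev = [], 0.0
    for e in edges:
        target = f(e + width / 2)   # f at the bin's midpoint
        weights.append(target - prev)
        prev = target
    def net(x):
        # Active units telescope to f evaluated at the current bin's midpoint.
        return sum(w * step(x - e) for w, e in zip(weights, edges))
    return net

# 200 hidden units approximating sin on [0, pi]; the worst-case error
# shrinks as n_units grows.
net = make_network(math.sin, 0.0, math.pi, 200)
err = max(abs(net(x / 100) - math.sin(x / 100)) for x in range(0, 315))
```

This is only the intuition behind the theorem; the actual result covers smooth sigmoidal activations and gives the full density argument.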
karthikm on Oct 14, 2009
jat850 on Jan 18, 2013
I signed up for Andrew's ML course (https://class.coursera.org/ml/lecture/preview , start date is not announced yet). Really looking forward to it.
jezclaremurugan on Nov 17, 2011
jwp on Apr 3, 2012
lovelearning on June 28, 2015
Once you're familiar with the basics, you can go deeper into the subject with the books suggested here.
[1]: https://www.coursera.org/learn/machine-learning
nikhizzle on Mar 3, 2014
I can recommend a few professors who really opened my mind to how to use math, all at UCSC (CS grad school for me):
- Dave Helmbold (Machine Learning)
- Kevin Ross (Operations Research)
- Martin Abadi (Security)
waterlesscloud on Oct 27, 2015
It's as good a place to start as any, and the benefit of a scheduled class is that you'll have a community doing the same work at the same time to help you out.
https://www.coursera.org/learn/machine-learning/
webhat on June 28, 2014
IMHO, if course community were a large factor, this highly trafficked course with its high level of community discussion would most likely have exceeded this. As you say, later ML courses have had much community discussion, yet they have had lower completion rates.
At the moment there is no substitute for real interaction with the docent or professor.
the_real_r2d2 on Oct 20, 2009
Also I accompanied my learning with some other books (i.e. Machine Learning from Tom Mitchell) and academic papers that filled the theory gap.
As a starting point to teach the basics of ML and to encourage people to go and learn more, I think PCI is very good.
wirthjason on Aug 12, 2021
Any other suggestions for good Packt books?
I agree, Packt’s quality is much lower than other publishers. As a rule of thumb I stay away but occasionally there’s a gem.
I’ve been looking at “Machine Learning for Algorithmic Trading”. It feels like a dump of Wikipedia and a bunch of Jupyter notebooks with sloppy code. I cannot decide if it’s worth the pain of slogging through that mess.
https://www.amazon.com/Machine-Learning-Algorithmic-Trading-...
mindcrime on Aug 9, 2015
So, what would I make the new "vertical bar" of my own "T"? I'm leaning towards two somewhat parallel tracks, both of which involve things that were on my "horizontal bar" going back some time anyway; it's just time to start emphasizing them more. One is Semantic Web / Artificial Intelligence / Cognitive Computing stuff. And related, but not necessarily exactly the same, is Big Data / Machine Learning / Analytics / BI stuff.
I expect skills in those areas are going to be valuable for some time to come. I started digging into the SemWeb stuff several years ago, actually, but I wish now that I'd started investing more into machine learning a few years earlier.
As far as how to do that? Well, read and experiment and build sample projects for yourself... the same things you'd do to learn any new skill. There are a number of good books and websites out there. I'm reading Machine Learning for Hackers now, and also playing around with things like Mahout, OpenNLP, Giraph, and Spark with GraphX and MLlib. I'm also looking into learning R, as it's widely used in that analytics / machine learning world.
maurits on Aug 28, 2011
- Machine Learning by Tom M Mitchell
http://www.cs.cmu.edu/~tom/mlbook.html
For general reading and introductions I also like:
- Pattern Classification by Richard Duda
- Pattern Recognition and Machine Learning by Christopher Bishop
For a bit more emphasis on statistics and math, I usually dive in to
- Classification, Parameter Estimation and State Estimation by van der Heijden
And last, but certainly not least:
- Information Theory, Inference, and Learning Algorithms by David MacKay, available here:
http://www.inference.phy.cam.ac.uk/mackay/itila/
gtani on Apr 4, 2018
If you look in Goodfellow et al.'s Deep Learning book, Murphy's Machine Learning text, and others mentioned here (Learning from Data, Shalev-Shwartz/Ben-David), the prerequisites are always some variation of the above, and I think you could do a lot of it at U.S. community colleges, at least the CCs around me.
Frankly, you'd have to do a bunch of self-study beyond CC, and there's no shortcut/royal road. So the key is self-study; that's a discipline anyone who wants to do Data Science/machine learning for real needs.
boniface316 on Jan 17, 2017
https://www.coursera.org/learn/machine-learning
zintinio4 on Nov 18, 2015
I was forced to learn a massive amount in a short period of time for work, but I'd previously watched Andrew Ng's lectures, as well as majored in Math/CS. I can also generally recommend Hinton's NN lectures, Socher's Deep learning for NLP, Andrew Ng's Machine Learning, and a few books.
samg_ on Nov 15, 2012
I have spent a lot of time on Khan Academy learning the calculus. In my experience you can get by with a surprisingly small amount of calculus, but it happens to be a small amount at a high level.
For example, backpropagation is just repeated application of the chain rule. It did take a while to get a handle on the derivatives, but it's worth it.
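For the curious, here is a minimal sketch (my own, not from Khan Academy or any course above) of backpropagation as repeated chain rule, on a network with one input, one sigmoid hidden unit, and a linear output, with a finite-difference check of the gradient:

```python
import math

# Backpropagation as repeated chain rule on a tiny network:
#   h = sigmoid(w1 * x),  y_hat = w2 * h,  L = (y_hat - y)^2

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    h = sigmoid(w1 * x)      # hidden activation
    return h, w2 * h         # (hidden value, prediction)

def gradients(x, y, w1, w2):
    h, y_hat = forward(x, w1, w2)
    dL_dyhat = 2 * (y_hat - y)        # dL/dy_hat
    dL_dw2 = dL_dyhat * h             # chain rule: through the output weight
    dL_dh = dL_dyhat * w2             # propagate back to the hidden unit
    dL_dw1 = dL_dh * h * (1 - h) * x  # chain rule: sigmoid'(z) = h(1-h)
    return dL_dw1, dL_dw2

# Sanity-check the analytic gradient against a finite-difference estimate.
x, y, w1, w2 = 0.5, 1.0, 0.3, -0.7
g1, g2 = gradients(x, y, w1, w2)
eps = 1e-6
num_g1 = ((forward(x, w1 + eps, w2)[1] - y) ** 2 -
          (forward(x, w1 - eps, w2)[1] - y) ** 2) / (2 * eps)
```

Every layer of a deep network just adds one more factor to that chain of local derivatives, which is the whole trick.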
sindoc on Aug 24, 2011
- Lambda Calculus
- Reflective and Meta-programming
- Meta-object Protocol
- Closures
- Continuations
- Monads
- Arrows
- First-class Everything
- Stack and Register-based Programming
- XML
- Linear Algebra
- Fractal and Wavelet Image Compression
- Regular Expressions
- Clojure
- LaTeX
NB. The concepts to which the above keywords refer may or may not have been covered by the article. The keywords themselves are, however, absent.
Additional reading suggestions:
- Jon Bentley's Programming Pearls
- Tom Mitchell's Machine Learning
- Douglas Hofstadter's Gödel, Escher, Bach
- Brian Kernighan and Rob Pike's Unix Programming Environment