HackerNews Readings
40,000 HackerNews book recommendations identified using NLP and deep learning


An Introduction to Statistical Learning: with Applications in R (Springer Texts in Statistics)

Gareth James, Daniela Witten, et al.

4.8 on Amazon

72 HN comments

Mastering Regular Expressions

Jeffrey E. F. Friedl

4.6 on Amazon

72 HN comments

Game Programming Patterns

Robert Nystrom

4.8 on Amazon

68 HN comments

Steve Jobs

Walter Isaacson, Dylan Baker, et al.

4.6 on Amazon

67 HN comments

Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series)

Kevin P. Murphy

4.3 on Amazon

66 HN comments

The Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage

Cliff Stoll, Will Damron, et al.

4.7 on Amazon

61 HN comments

Programming: Principles and Practice Using C++ (2nd Edition)

Bjarne Stroustrup

4.5 on Amazon

58 HN comments

Ghost in the Wires: My Adventures as the World’s Most Wanted Hacker

Kevin Mitnick, William L. Simon, et al.

4.6 on Amazon

55 HN comments

Modern Operating Systems

Andrew Tanenbaum and Herbert Bos

4.3 on Amazon

54 HN comments

Head First Design Patterns: Building Extensible and Maintainable Object-Oriented Software 2nd Edition

Eric Freeman and Elisabeth Robson

4.7 on Amazon

52 HN comments

The Singularity Is Near: When Humans Transcend Biology

Ray Kurzweil, George Wilson, et al.

4.4 on Amazon

51 HN comments

The Everything Store: Jeff Bezos and the Age of Amazon

Brad Stone, Pete Larkin, et al.

4.6 on Amazon

51 HN comments

Compilers: Principles, Techniques, and Tools

Alfred Aho, Monica Lam, et al.

4.1 on Amazon

50 HN comments

Test Driven Development: By Example

Kent Beck

4.4 on Amazon

45 HN comments

Patterns of Enterprise Application Architecture

Martin Fowler

4.5 on Amazon

43 HN comments

Sorted by relevance

criddell on Dec 4, 2018

What's the difference between Udacity and Udemy? Are they the same company?

Also, I'd love to hear about your favorite online courses. I took Andrew Ng's Machine Learning and Dan Boneh's Cryptography (both on Coursera) and they were excellent.

KerrickStaley on Apr 3, 2018

I really liked Geoff Hinton's Neural Networks for Machine Learning (https://www.coursera.org/learn/neural-networks). It goes into a lot of depth (much more so than Andrew Ng's Machine Learning course) and is fairly challenging.

pietroppeter on July 3, 2020

Tom Mitchell's book is still a great book for understanding what Machine Learning is about.

mlwhiz on Sep 3, 2019

Machine Learning by Andrew Ng, for sure, and by far the best.

judk on Mar 8, 2014

The best courses I took were ones where the lecturer published lecture notes. That shows the professor actually created lecture notes!

Andrew Ng's Machine Learning lecture notes are world famous.

colund on Mar 15, 2016

I enjoyed Andrew Ng's Machine Learning course on Coursera. Why don't you give it a shot?

dimatura on July 20, 2014

I would also suggest K. Murphy's Machine Learning for the journeyman level. At the intermediate apprentice-journeyman level, Alpaydin's Introduction to Machine Learning is very friendly.

notimetorelax on Oct 3, 2012

Yeap we got spoiled with earlier classes: Algorithms by Tim Roughgarden, Machine Learning by Andrew Ng, and many more. We probably need to follow a class on gratitude.

Oh well, to be fair I would donate quite a lot for each course that I enjoyed.

smoyer on Apr 3, 2018

Agreed, Ng's Machine Learning and Odersky's FP in Scala were my favorites. I'm looking for a good bioinformatics course at the moment. I wrote a small program that attempts to find CRISPR sites for my daughter, and it would be great to know more of the background.

manca on Nov 29, 2020

Machine Learning by Dr. Ng is a classic. I also loved Algorithms I and II by Tim Roughgarden. Sedgewick's Algorithms is also good.

rivaldo on Oct 10, 2016

I have been learning about Machine Learning via Michael Nielsen's book (http://neuralnetworksanddeeplearning.com) and I can't recommend it enough. Fantastic content, completely free, very well explained.

elibryan on May 22, 2010

Tom Mitchell's Machine Learning is also very good (despite being a bit older) http://www.cs.cmu.edu/~tom/mlbook.html

p1esk on June 6, 2016

For a good conceptual outline you should definitely check out neuralnetworksanddeeplearning.com by Michael Nielsen. Then if you want to go deeper, look for the Machine Learning course on Coursera by Pedro Domingos.

fjellfras on June 11, 2012

Sure, I started off with Andrew Ng's course on Coursera. Then I moved on to the book Machine Learning by Tom Mitchell. I also have the PCI book to supplement Mitchell's book with code examples. I got Bishop's book too, but to be honest I'm finding it a little harder to follow than the others.

theuncommon on Apr 4, 2018

Machine Learning by Andrew Ng on Coursera is the best MOOC I've taken so far. It has great explanations on complex topics, fun activities, and a really well put together curriculum on machine learning.

tomaskazemekas on June 28, 2014

One way of dealing with the teaching-assistant shortage during a course is community discussion forums. The best implementations of these I've found are on Udacity and Coursera. Another excellent teaching aid is Community Teaching Assistants on some of the Coursera courses, e.g. Machine Learning by Andrew Ng.

dbecker on Jan 31, 2014

Not a journal or blog, but I highly recommend Andrew Ng's Machine Learning course on Coursera.

denzil_correa on Mar 12, 2013

In terms of Machine Learning for text data, Chapter 6 is highly recommended.

http://nltk.org/book/ch06.html

ThomasCharles on Aug 16, 2011

For those interested:

Machine Learning - Professor Andrew Ng - http://www.ml-class.org

Introduction to Databases - Professor Jennifer Widom - http://www.db-class.org

Introduction to Artificial Intelligence - Professor Sebastian Thrun and Dr. Peter Norvig - http://www.ai-class.com

knn on Mar 16, 2016

AI by Russell and Norvig. Machine learning by Murphy, Elements of Statistical Learning by Hastie et al. Just a few good ones out of many!

tonyedgecombe on Jan 16, 2017

Machine Learning by Andrew Ng https://www.coursera.org/learn/machine-learning

The maths is fairly straightforward and the concepts are explained well.

Yadi on June 10, 2015

My first introduction to Machine Learning was C. Bishop's book!

So this book must be awesome and I also have been looking around to find more on model-based ML stuff to read.

I guess this is part of the 2013 Microsoft Research paper [1].

---------------

[1] http://research.microsoft.com/en-us/um/people/cmbishop/downl...

salusinarduis on Apr 9, 2015

Currently reading:

* Machine Learning - Peter Flach

* Guns, Germs, and Steel - Jared Diamond

* Dune ( for the third time, I love it :) ) - Frank Herbert

I usually read some of pg's, sama's, avc's, or some other prominent essayist's writing in the bath each week.

flor1s on Mar 18, 2017

The course is also quite easy to follow without buying the book. I love the exercises in which you are programming an intelligent agent to move through a maze. It reminded me of how we learned programming in university using Karel The Robot.

This alongside Andrew Ng's Machine Learning course was my first exposure to the field. https://www.coursera.org/learn/machine-learning

I can also recommend Sebastian Thrun's Artificial Intelligence for Robotics course: https://www.udacity.com/course/artificial-intelligence-for-r...

CodeGlitch on Apr 6, 2020

Janani Ravi's Machine Learning course on PluralSight[1]

It's using scikit-learn, which I've been meaning to pick up for a while. Other than that, learning how to use the ELK stack (ElasticSearch, Kibana) for some network-traffic analysis stuff I've been meaning to implement.

I think we'll all come out of this period a lot smarter :)

[1] https://pluralsight.com

rrampage on Apr 22, 2020

Sure.

1. Introduction to the Theory of Computation - Sipser and Introduction to Algorithms - CLRS (for the Algorithms course)

2. Machine Learning - Tom Mitchell

3. Artificial Intelligence - A Modern Approach - Russell and Norvig

4. Design of Everyday Things - Norman (for HCI)

5. Operating Systems: Three Easy Pieces [0]

Most of the courses had their own notes, slides and suggested research papers as primary reading and the textbooks were mostly used as a secondary reference.

[0] - http://pages.cs.wisc.edu/~remzi/OSTEP/

kjain on Mar 11, 2015

I would recommend going through the Caltech course Learning from Data, run by Yaser Abu-Mostafa, as the first step. https://www.edx.org/course/learning-data-caltechx-cs1156x#.V...

It is one of the best places to start. Note that the course will require you to spend considerable time. If you find it challenging, you can also look at the Machine Learning class by Andrew Ng on Coursera.

Once you have completed these courses, you can take up the Coursera course on Neural Nets or look at tutorials on deeplearning.net.

jmzachary on Nov 13, 2007

Machine Learning by Tom Mitchell for a serious academic book.

Programming Collective Intelligence by Toby Segaran for a practical approach in Python.

pmayrgundter on Dec 20, 2020

Here's another along the same lines, with a demo learning faces according to the Machine Learning book by Tom Mitchell@CMU:

https://github.com/pablo-mayrgundter/freality/tree/master/ml...

allenleein on Oct 15, 2016

Courses You MUST Take:

1. Machine Learning by Andrew Ng (https://www.coursera.org/learn/machine-learning) /// Class notes: (http://holehouse.org/mlclass/index.html)

2. Yaser Abu-Mostafa’s Machine Learning course, which focuses much more on theory than the Coursera class but is still relevant for beginners. (https://work.caltech.edu/telecourse.html)

3. Neural Networks and Deep Learning (Recommended by Google Brain Team) (http://neuralnetworksanddeeplearning.com/)

4. Probabilistic Graphical Models (https://www.coursera.org/learn/probabilistic-graphical-model...)

5. Computational Neuroscience (https://www.coursera.org/learn/computational-neuroscience)

6. Statistical Machine Learning (http://www.stat.cmu.edu/~larry/=sml/)

If you want to learn AI:
https://medium.com/open-intelligence/recommended-resources-f...

jahan on Sep 3, 2015

Top ML/DM Books - in this post we have collected various signals (e.g. online reviews, online ratings, book price, and so on) for hundreds of Machine Learning & Data Mining books. We have used those signals to compute a quality score for each book and rank the top 16 Machine Learning, Data Mining, and NLP books. Our ranking approach is objective, data-driven, and fair. Enjoy the list!

indigentmartian on Jan 21, 2017

Andrew Ng's Coursera course simply titled "Machine Learning" is good - it addresses the mathematics of fundamental algorithms and concepts while giving practical examples and applications: https://www.coursera.org/learn/machine-learning

Regarding books, there are many very high quality textbooks available (legitimately) for free online:

Introduction to Statistical Learning (James et al., 2014) http://www-bcf.usc.edu/~gareth/ISL/

the above book shares some authors with the denser and more in-depth/advanced

The Elements of Statistical Learning (Hastie et al., 2009) http://statweb.stanford.edu/~tibs/ElemStatLearn/

Information Theory: Inference & Learning Algorithms (MacKay, 2003) http://www.inference.phy.cam.ac.uk/itila/p0.html

Bayesian Reasoning & Machine Learning (Barber, 2012) http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=...

Deep Learning (Goodfellow et al., 2016) http://www.deeplearningbook.org/

Reinforcement Learning: An Introduction (Sutton & Barto, 1998) http://webdocs.cs.ualberta.ca/~sutton/book/ebook/the-book.ht...

^^ the above books are used in many graduate courses on machine learning and are varied in their approach and readability, but go deep into the fundamentals and theory of machine learning. Most contain primers on the relevant maths, too, so you can either use these to brush up on what you already know or as a starting point to look for more relevant maths materials.

If you want more practical books/courses, more machine-learning focussed data science books can be helpful. For trying out what you've learned, Kaggle is great for providing data sets and problems.

sampo on Oct 20, 2012

Andrew Ng's Machine Learning course at Coursera, week 1, contains about 1 hour of Linear Algebra review: lectures on vectors, matrices, their multiplication, transpose, and inverse.

So do you think these lectures are not enough to bring one up to speed in applying these concepts in linear regression?

Of course, a formally educated person has taken a full semester of Linear Algebra and solved dozens of homework problems of the "transpose this", "invert that" kind, so it's difficult to guess how much homework of the boring sort would be needed before one is able to apply these concepts in problem solving.
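As a concrete illustration of what applying those concepts to linear regression can look like, here is a minimal NumPy sketch of the normal equation (not material from the course; the data is made up), using exactly the operations that review covers: transpose, matrix multiplication, and inverse.

    import numpy as np

    # Toy data: a design matrix with a bias column, and a target vector.
    X = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    y = np.array([1.0, 2.0, 3.0])

    # Normal equation: theta = (X^T X)^{-1} X^T y
    theta = np.linalg.inv(X.T @ X) @ X.T @ y
    print(theta)  # roughly [0., 1.], i.e. intercept 0 and slope 1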

jdj on Apr 6, 2011

One book that I would suggest to anyone is Introduction to Automata Theory, Languages, and Computation - HMU. It is very approachable and presents some very interesting topics (so you won't write a regex for matching HTML and will learn what P vs NP means). On a more practical side, I think a must-read for machine learning is Tom Mitchell's Machine Learning. Another book that, from what I've heard, is easier to digest is Data Mining: Practical Machine Learning Tools and Techniques.

arohner on Oct 14, 2009

I'm mainly interested in Machine Learning algorithms. I've read the first 75 pages of this book, and skimmed most of MacKay, and I prefer this one. It goes into a lot more detail on the performance of different algorithms, how they handle sparse data, error rates, and high dimensionality.

MacKay's looks like a good book, but appears more applicable to pure information theory than to ML specifically.

rohitarondekar on Mar 30, 2015

I've started many MOOCs but never finished, because I couldn't keep up with watching the videos. Not being able to flip back and forth is really limiting. The only courses [1] I've finished had good-quality notes; I depended completely on the notes and avoided the lecture videos.

[1]: Roughgarden's Algorithms part I, Programming Languages by Dan Grossman and Machine Learning by Andrew Ng.

mindcrime on Feb 22, 2016

Assuming that you consider Machine Learning to be either a subset of AI (as I do) or a sibling field, and want to learn aspects of ML, then consider Andrew Ng's Machine Learning course on Coursera. It's a great introduction and doesn't require a ton in terms of prerequisites. You'll see some multi-variable calculus and linear algebra, but he does the calculus derivations for you, and there's a pretty adequate review of the relevant parts of Linear Algebra.

In addition, if you don't already have a background in Calculus and Linear Algebra, then supplement the Ng course with the Khan Academy stuff on Calculus and Linear Algebra, or other courses you can pick up on Coursera or Edx or whatever.

If you get really interested in neural networks (which are all the rage these days) after the Ng class, there's a freely available book on Neural Network design that you could look at. It doesn't cover all the very latest techniques, but it would help you build the foundation of understanding.

http://hagan.okstate.edu/nnd.html

There's also a MOOC around the Learning From Data book that you could check out.

http://amlbook.com/

https://work.caltech.edu/telecourse.html

OTOH, if you're making a sharp distinction between "Classic AI" and "Machine Learning" and you really care mainly about the classical stuff, then you might want to start with the Berkeley CS188 class. You can take it through EdX (https://www.edx.org/course/artificial-intelligence-uc-berkel...) or just watch the videos and download the notes and stuff from http://ai.berkeley.edu/home.html

And if you just want to dive into reading some classic papers and stuff, check out:

http://publications.csail.mit.edu/ai/pubs_browse.shtml

and/or

http://ijcai.org/past_proceedings

Another good resource is

http://aitopics.org/

ramblenode on Aug 28, 2016

Two that I have enjoyed:

1. "Machine Learning" - Murphy

It's a classic and a rigorous introduction to the subject. It focuses on theory over implementation so it's more for developing a principled understanding of machine learning fundamentals than for getting you up to speed with modern tools necessary for solving real problems.

2. "Neural Networks and Deep Learning" - Nielson

This is a free online book that is very accessible and engaging. It covers basic theory and implements examples in Python.

mustafaf on Feb 4, 2011

If you really want to learn the fundamental underpinnings of machine learning, you will need a strong background in probability and stochastic processes. I would suggest Python (or MATLAB if you can get access to it) to learn how different methods work. That way you can separate mathematical issues from programming issues. As far as courses go, you should be looking for courses in Linear Algebra, Numerical Computation/Optimization (Convex, Nonlinear), Statistical Inference, and Stochastic Processes.

Good References:
1) Elements of Statistical Learning - Hastie, Tibshirani and Friedman

2) Pattern Classification - Duda, Hart and Stork

3) Pattern Recognition - Theodoridis, Koutroumbas

4) Machine Learning - Tom Mitchell

5) http://videolectures.net/Top/Computer_Science/Machine_Learni...

saravana85 on Oct 12, 2020

I like Machine Learning (in Python and R) for Dummies by John Paul Mueller and Luca Massaron.

guessmyname on Aug 28, 2016

    Machine Learning Recipes with Josh Gordon
A series of videos backed by Google Developers' YouTube channel
https://www.youtube.com/playlist?list=PLOU2XLYxmsIIuiBfYad6rFYQU_jL2ryal

sampo on Dec 12, 2013

The Stanford database class should be still open for self-study. It contains video lectures and lots of database query exercises. For the exercises, you can write the query code in the web browser.

https://class2go.stanford.edu/db/Winter2013/preview/

I have taken some MOOCs in past years, including Andrew Ng's Machine Learning and Martin Odersky's Functional Programming Principles in Scala, and I have to say that Jennifer Widom's database class is the most streamlined, best thought out, and smoothest of all the MOOCs I have seen.

Everything works, the lectures are packed with information and give sufficient material for one to complete the exercises, there are no rough edges, and there is a large-ish amount of exercises that give good coverage of the lectured material.

e-dard on Oct 25, 2012

Hi, machine learning PhD here - one way you can start is by brushing up on some fundamentals. A book such as Machine Learning, by Tom M. Mitchell, is a reasonable start.

Also, in terms of applying ML, you could scoot through Andrew Ng's Machine Learning Coursera course (not sure if it's running at the moment, though).

Finally... one tip: typically I have found that when you want to start applying ML to real-world problems, start simple and only iterate when the results of your approach are not satisficing. This is usually because all the bleeding-edge ML research/techniques don't consider a shit-load of real-world issues, like scalability, applicability to a wide range of problems, unstructured or noisy data, and so on.

YeGoblynQueenne on Nov 7, 2020

Amicably, can I ask you how you know what you say above to be true? Where does your knowledge of AI come from?

Edit: I'm asking because your comment shows a confusion that is all too common on the internets today: thinking of "GOFAI" (symbolic AI) and "ML" (machine learning) as two somehow incompatible and perhaps mutually exclusive sub-fields of AI. This is as far from the truth as it could be. For instance, machine learning really took off as a subject of research in the 1980s with the work of Ryszard Michalski and others, who considered machine learning approaches for the purpose of overcoming the "knowledge acquisition bottleneck", i.e. the difficulty of hand-crafting production rules for expert systems. Indeed, most of the early machine learning systems were propositional-logic based, i.e. symbolic. And of course, one of the most widely used and well-known types of machine learning approach to this day, decision trees, hails from that time and also learns propositional-logic, symbolic models.

Of course, most people today know "ML" as a byword for deep learning, or at best statistical pattern recognition (if that). It's just another aberration brought on by the sudden explosion of interest in a once small field.

I refer you to Michalski's textbook, "Machine Learning: An Artificial Intelligence Approach" and Tom Mitchell's "Machine Learning" for more information on the early days of the field, i.e. until ca. 2000.

wirrbel on Feb 26, 2018

> This makes a good case for not learning machine learning three years ago.

Have you looked at the monograph at all? This could have been written 5 to 10 years ago, even 15 years ago; maybe a few sections would have looked a little different. In fact, I like to recommend Mitchell's Machine Learning book (I think it was written in the 90s) as an introduction to people with a serious interest.

There currently is a lot of hype going on for machine learning algorithms, because we see good progress in things like computer vision / pattern recognition. This kicks off a marketing machinery that really blurs the reality.

In reality, we have an established body of methods and modelling techniques that are sufficient, because the available data is the bottleneck for prediction quality. The actual challenge is to come up with a valuable business proposition, not necessarily to build the predictive model.

martingoodson on June 3, 2013

Pattern Recognition and Machine Learning by Christopher M. Bishop

Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series) by Kevin Murphy

zumda on July 17, 2012

I took the Machine Learning course by Andrew Ng some time ago. And it was a really great experience.

The lessons are interesting, and I really like their concept of quizzes in the middle of the lessons. They show the question in the video, he explains it, then the video pauses and you choose which answer is right, or type in the number.

The homework is also well done and enhances the knowledge from the course. I had a lot of fun doing the lectures each and every week. You probably want to put aside a set day and time when you will do it, though, or else it is easy to do other things and "do the lectures later".

If they continue like this, there is soon no reason to go to a university anymore. :) (which is a catch-22 since the lecturers are paid by universities...)

allthing on Apr 21, 2019

I believe the universal approximation theorem is for a single hidden layer. When more layers are added, arbitrary functions can be approximated.

From section 4.6.2 of Tom Mitchell's Machine Learning book:
"Arbitrary functions. Any function can be approximated to arbitrary accuracy by a network with three layers of units (Cybenko 1988)."

karthikm on Oct 14, 2009

Another useful resource might be the 20 part lecture series on Machine Learning by Professor Andrew Ng for CS 229 in the Stanford Computer Science department. Youtube playlist URL - http://www.youtube.com/view_play_list?p=A89DCFA6ADACE599

jat850 on Jan 18, 2013

Definitely. At work today we were watching a video on Machine Learning by Andrew Ng and he made a number of references to LeCun's involvement in the field, and listed him in the acknowledgements section of his talk. It has encouraged me to look into more of his areas of work.

I signed up for Andrew's ML course (https://class.coursera.org/ml/lecture/preview , start date is not announced yet). Really looking forward to it.

jezclaremurugan on Nov 17, 2011

I'm currently doing Machine Learning by Prof. Ng. It is totally free. You watch videos of lectures (not lectures recorded in a class; the lectures are made specially for this). There are weekly quizzes based on the classes, and programming assignments. I don't think places are limited, but they don't allow people to sign up after the last date.

jwp on Apr 3, 2012

What he said! I'd like to add that a key to squeezing more out of NathanRice's post is the phrase "conjugate prior." Another totally natural thing would be to use a Gaussian prior & likelihood, then update the posterior as ratings arrive. This would take advantage of the ordinality of ratings as NR suggests. Bishop's Machine Learning book goes into this sort of stuff in more depth.
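For anyone curious what that Gaussian update looks like in practice, here is a minimal sketch (the numbers are made up, and a known rating variance is assumed): a Normal prior over an item's true rating whose posterior is updated in closed form as each rating arrives.

    # Prior over the item's "true" rating: Normal(mu, tau2).
    # Each observed rating is assumed Normal(true rating, sigma2).
    mu, tau2 = 3.0, 1.0
    sigma2 = 0.5

    for rating in [4, 5, 4, 3, 5]:
        post_prec = 1.0 / tau2 + 1.0 / sigma2       # precisions add
        mu = (mu / tau2 + rating / sigma2) / post_prec
        tau2 = 1.0 / post_prec

    print(f"posterior mean {mu:.2f}, variance {tau2:.3f}")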

lovelearning on June 28, 2015

I started with Andrew Ng's Machine Learning course on Coursera [1]. He presents the entire subject, including NNs and prerequisite theory, in a non-intimidating, intuitive fashion. There are simple coding exercises, such as digit recognition using NNs.

Once you're familiar with the basics, you can go deeper into the subject with the books suggested here.

[1]: https://www.coursera.org/learn/machine-learning

nikhizzle on Mar 3, 2014

Sorry, did not learn much from books. The only one of note is the Tom Mitchell Machine Learning book, which already requires basic fluency in advanced mathematics. I also believe it is a little out of date.

I can recommend a few professors who really opened my mind to how to use math, all at UCSC (CS grad school for me):

- Dave Helmbold (Machine Learning)
- Kevin Ross (Operations Research)
- Martin Abadi (Security)

waterlesscloud on Oct 27, 2015

A new session of Andrew Ng's Machine Learning class (a version of his Stanford class) starts at Coursera on Monday. It's a quite manageable introduction to the field, with some hands-on programming involved. There's just the right mix of math and theory in there, with some refresher material for some of the math.

It's as good a place to start as any, and the benefit of a scheduled class is that you'll have a community doing the same work at the same time to help you out.

https://www.coursera.org/learn/machine-learning/

webhat on June 28, 2014

In one of the most followed and most completed courses, which happens to be the first Machine Learning by Andrew Ng, the level of completion was a meager 12.5%. And even at that, Ng's course is an outlier, as no other courses of similar size, with the exception of AI, have come anywhere close to that level of completion. The numbers actually show the opposite: a large course has a lower percentage of completion.

IMHO if course community were a large factor, this highly trafficked course with a high level of community discussion would most likely have exceeded this. As you say, later ML courses have had much community discussion, yet they have had lower completion rates.

At the moment there is no substitute for real interaction with the docent or professor.

the_real_r2d2 on Oct 20, 2009

Yes, it is short and basic in theory, but it is very practical. In my case I learn best by trying and applying concepts in practice. That is why I found the book very useful.
Also, I accompanied my learning with some other books (e.g. Machine Learning by Tom Mitchell) and academic papers that filled the theory gap.
As a starting point to teach the basics of ML and to encourage you to go and learn more, I think PCI is very good.

wirthjason on Aug 12, 2021

That was a good book. I looked at it recently and it’s now in the 3rd edition! Congrats.

Any other suggestions for good Packt books?

I agree, Packt’s quality is much lower than other publishers. As a rule of thumb I stay away but occasionally there’s a gem.

I’ve been looking at “Machine Learning for Algorithmic Trading”. It feels like a dump of Wikipedia and a bunch of Jupyter notebooks with sloppy code. I cannot decide if it’s worth the pain of slogging through that mess.

https://www.amazon.com/Machine-Learning-Algorithmic-Trading-...

mindcrime on Aug 9, 2015

I think the biggest thing is that you grok the idea of being "t-shaped" and are taking the initiative to do something. Now, the hardest part is - I think - deciding what to make the vertical bar of your "T". It can probably be almost anything, but some things are going to be more valuable from a career standpoint than others. I chose "java" for that around 2000, after having been focused on C++ for a few years before that. That decision turned out well, but I feel like Java has largely run its course now. I mean, it'll still be around for a while, but it's not "sexy" anymore.

So, what would I make the new "vertical bar" of my own "T"? I'm leaning towards two somewhat parallel tracks, and both of these involve things that were on my "horizontal bar" going back some time anyway; it's just time to start emphasizing them more. One is Semantic Web / Artificial Intelligence / Cognitive Computing stuff. And related, but not necessarily exactly the same, is Big Data / Machine Learning / Analytics / BI stuff.

I expect skills in those areas are going to be valuable for some time to come. I started digging into the SemWeb stuff several years ago, actually, but I wish now that I'd started investing more into machine learning a few years earlier.

As far as how to do that? Well, read and experiment and build sample projects for yourself... the same things you'd do to learn any new skill. There are a number of good books and websites out there. I'm reading Machine Learning for Hackers now, and also playing around with things like Mahout, OpenNLP, Giraph, and Spark with GraphX and MLlib. I'm also looking into learning R, as it's widely used in that analytics / machine learning world.

maurits on Aug 28, 2011

Off the top of my head,

- Machine Learning by Tom M Mitchell
http://www.cs.cmu.edu/~tom/mlbook.html

For general reading and introductions I also like:

- Pattern Classification by Richard Duda

- Pattern Recognition and Machine Learning by Christopher Bishop

For a bit more emphasis on statistics and math, I usually dive in to

- Classification,Parameter Estimation and State Estimation by van der Heijden

And last, but certainly not least:

- Information Theory, Inference, and Learning Algorithms by David MacKay, available here:

http://www.inference.phy.cam.ac.uk/mackay/itila/

gtani on Apr 4, 2018

The target audience would be people who have roughly 2 years of undergrad math: the 4-semester calc sequence or high school equivalent, probability/stats, linear algebra, and some computational courses using e.g. Numerical Analysis by Burden/Faires.

If you look in Goodfellow et al.'s Deep Learning book, Murphy's Machine Learning text, and others mentioned here (Learning from Data, Shalev-Shwartz/Ben-David), the prereqs are always some variation of the above, and I think you could do a lot of it at U.S. community colleges, at least the CCs around me.

Frankly, you'd have to do a bunch of self-study beyond CC, and there's no shortcut/royal road. So the key is self-study; that's a discipline anyone who wants to do Data Science/machine learning for real needs.

boniface316 on Jan 17, 2017

I am taking Machine Learning on Coursera by Andrew Ng. Initially I was intimidated by the idea of ML as I had no prior programming experience. I started learning data science stuff less than 6 months ago. I started to feel motivated and confident about ML by taking this course. I highly recommend it.

https://www.coursera.org/learn/machine-learning

zintinio4 on Nov 18, 2015

Find a problem to work on in a domain you find interesting. By reading published papers and trying to attack the problem, you'll be forced to pick up a lot of other knowledge that isn't commonly discussed, like feature extraction and selection, dimensionality reduction, dealing with sparsity, common metrics for that problem, recent work, etc.

I was forced to learn a massive amount in a short period of time for work, but I'd previously watched Andrew Ng's lectures, as well as majored in Math/CS. I can also generally recommend Hinton's NN lectures, Socher's Deep learning for NLP, Andrew Ng's Machine Learning, and a few books.

samg_ on Nov 15, 2012

I've taken Andrew Ng's Machine Learning class, Daphne Koller's Probabilistic Graphical Models class, Dan Jurafsky and Christopher Manning's Natural Language Processing class, and currently Geoff Hinton's Neural Networks class.

I have spent a lot of time on Khan Academy to learn the calculus. In my experience you can get by with a surprisingly small amount of calculus, but it happens to be a small amount from a high level.

For example, backpropagation is just repeated application of the chain rule. It did take a while to get a handle on the derivatives, but it's worth it.
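As a small illustration of the "backprop is just the chain rule" point (a sketch, not material from any of those courses): the gradient of a one-neuron squared loss written as a product of the per-step derivatives, checked against a finite difference.

    import math

    def loss(w, x=2.0, y=1.0):
        a = 1.0 / (1.0 + math.exp(-w * x))   # sigmoid forward pass
        return 0.5 * (a - y) ** 2

    def grad(w, x=2.0, y=1.0):
        a = 1.0 / (1.0 + math.exp(-w * x))
        # Chain rule: dL/dw = dL/da * da/dz * dz/dw
        return (a - y) * a * (1.0 - a) * x

    w = 0.3
    numeric = (loss(w + 1e-6) - loss(w - 1e-6)) / 2e-6
    print(grad(w), numeric)   # the two values agree closely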

sindoc on Aug 24, 2011

Keywords that I believe would be worth mentioning:

- Lambda Calculus
- Reflective and Meta-programming
- Meta-object Protocol
- Closures
- Continuations
- Monads
- Arrows
- First-class Everything
- Stack and Register-based Programming
- XML
- Linear Algebra
- Fractal and Wavelet Image Compression
- Regular Expressions
- Clojure
- LaTeX

NB. The concepts to which the above keywords refer may or may not have been covered by the article. The keywords themselves are however absent.

Additional reading suggestions:

- Jon Bentley's Programming Pearls
- Tom Mitchell's Machine Learning
- Douglas Hofstadter's Gödel, Escher, Bach
- Brian Kernighan and Rob Pike's Unix Programming Environment
