HackerNews Readings
40,000 HackerNews book recommendations identified using NLP and deep learning


The Pragmatic Programmer: 20th Anniversary Edition, 2nd Edition: Your Journey to Mastery

David Thomas, Andrew Hunt, et al.

4.8 on Amazon

396 HN comments

Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture

David Kushner, Wil Wheaton, et al.

4.8 on Amazon

262 HN comments

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems

Martin Kleppmann

4.8 on Amazon

241 HN comments

Clean Code: A Handbook of Agile Software Craftsmanship

Robert C. Martin

4.7 on Amazon

232 HN comments

Code: The Hidden Language of Computer Hardware and Software

Charles Petzold

4.6 on Amazon

186 HN comments

Cracking the Coding Interview: 189 Programming Questions and Solutions

Gayle Laakmann McDowell

4.7 on Amazon

180 HN comments

The Soul of a New Machine

Tracy Kidder

4.6 on Amazon

177 HN comments

Refactoring: Improving the Design of Existing Code (2nd Edition) (Addison-Wesley Signature Series (Fowler))

Martin Fowler

4.7 on Amazon

116 HN comments

Thinking in Systems: A Primer

Donella H. Meadows and Diana Wright

4.6 on Amazon

104 HN comments

Superintelligence: Paths, Dangers, Strategies

Nick Bostrom, Napoleon Ryan, et al.

4.4 on Amazon

90 HN comments

The Idea Factory: Bell Labs and the Great Age of American Innovation

Jon Gertner

4.6 on Amazon

85 HN comments

Effective Java

Joshua Bloch

4.8 on Amazon

84 HN comments

Domain-Driven Design: Tackling Complexity in the Heart of Software

Eric Evans

4.6 on Amazon

83 HN comments

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy

Cathy O'Neil

4.5 on Amazon

75 HN comments

A Philosophy of Software Design

John Ousterhout

4.4 on Amazon

74 HN comments


cik on Sep 15, 2020

In every conversation like this I always bring up Cathy O'Neil's excellent Weapons of Math Destruction (https://www.bookdepository.com/Weapons-of-Math-Destruction-C...). I think it's an ethical must read for anyone in tech.

michaelcampbell on Sep 1, 2020

"Weapons of Math Destruction" by Cathy O'Neil has a section devoted to this sort of thing. Good read, IMO.

molestrangler on Sep 1, 2017

Go and read "Weapons of Math Destruction" by Cathy O’Neil.

It has an excellent chapter on the use of the ATS software that the majority of recruiters rely on these days.

https://en.wikipedia.org/wiki/Applicant_tracking_system

medecau on Mar 7, 2019

For those interested in learning more about these problems, there are two interesting books to read:
"Technically Wrong" and "Weapons of Math Destruction"

mattbk1 on May 12, 2020

- "Weapons of Math Destruction" by Cathy O'Neil is a good one.

- "Immersion" by Abbie Gascho Landis

- "Welcome to the Goddamn Ice Cube" by Blair Braverman

- "Sleeping Naked is Green" by Vanessa Farquharson

- "There's No Such Thing as Bad Weather" by Linda Akeson McGurk

WillPostForFood on June 23, 2017

Calling BS on big data is really important, but this article is weak. The New Yorker should be doing better. Try Weapons of Math Destruction by Cathy O'Neil for a much more informed critique.

https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

cik on Mar 4, 2019

Cathy O'Neil wrote a terrific book a couple of years back called 'Weapons of Math Destruction'. It's a must-read for those of us in this field - and generally a great eye-opener.

It would behoove us all to think about the consequences of the algorithms we're working on - much like this article.

murtali on Dec 21, 2017

Cathy O'Neil wrote a book called "Weapons of Math Destruction" -- interesting read.

You can listen to an interview she did on EconTalk -- interesting to learn more about the hidden biases.

http://www.econtalk.org/archives/2016/10/cathy_oneil_on_1.ht...

brianjoseff on Nov 11, 2017

Highly recommend you check out the book "Weapons of Math Destruction" by Cathy O'Neil

Very accessible set of case studies where algorithmic management of traditionally human-run bureaucratic processes is biased and destructive.

WhyIsItLikeThis on July 24, 2020

Here is my review of Weapons of Math Destruction: https://www.audiobookreviews.com/single_post.php?id=634

One of the absolute WORST books I have ever read.

333c on Mar 3, 2020

It's not clear from your comment whether or not you're familiar with the book Weapons of Math Destruction by Cathy O'Neil, but you echo many of its main arguments about these types of systems. I can recommend the book if you haven't read it.

petre on Feb 18, 2018

There's a book about this called Weapons of Math Destruction by Cathy O'Neil. Yes, it's legal. The more opaque the model and the algorithms are, the more they fuel discrimination.

Jtsummers on Oct 10, 2018

Weapons of Math Destruction offers good discussion on this problem, if you want to read more about it.

cpeterso on June 10, 2020

Mathematician Cathy O'Neil's book "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy" is a good introduction to the implicit biases in machine learning:

https://en.wikipedia.org/wiki/Weapons_of_Math_Destruction

michaelcampbell on Sep 3, 2020

Something along these lines graced HN's front page recently, but I'll repeat my comment from there: Read "Weapons of Math Destruction" by Cathy O'Neil. Truly frightening.

lelandgaunt on Dec 19, 2017

This is a good move. I hope other states follow suit. I highly recommend reading Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.

majewsky on Sep 19, 2017

> Cathy O'Neil in her book 'Weapons of Math Destruction'

For those of you who like listening more than reading, she was on 99% Invisible talking about destructive algorithms and her book a few weeks ago: http://99percentinvisible.org/episode/the-age-of-the-algorit...

icebraining on Dec 22, 2017

The book Weapons of Math Destruction raises similar questions - not just perpetuating racism, but also other discrimination (such as against the poor). There was a previous discussion on HN regarding an interview with the author: https://news.ycombinator.com/item?id=12642432

michaelcampbell on Oct 23, 2016

I'm halfway through "Weapons of Math Destruction", and this sort of thing is really, really scary. If the amount of damage done in the US already with big data models, with a much narrower focus, for smaller consequences, and ostensibly "good" purposes is any indication, this will be incredibly harmful.

rspeer on May 30, 2017

I suggest her book "Weapons of Math Destruction" in particular.

petre on Jan 1, 2017

They took a good idea (wellness programs) and made a creepy, dystopian thing out of it (biometric tracking that penalizes you on health insurance). That's what Weapons of Math Destruction is all about: tracking, data mining/collection, and algorithms that discriminate against people based on the data collected. This article is essentially an advert for the book. I've read some of it in a library.

nerdkid93 on Mar 3, 2021

"Weapons of Math Destruction" was our textbook for CS 6603 at Georgia Tech. It was a fascinating/terrifying look into some of the systems that seem to silently impact the lives of millions in the U.S. Cannot recommend it enough.

thedailymail on Sep 25, 2018

The book "Weapons of Math Destruction" by Cathy O'Neil has plenty of other examples of how profiling algorithms often end up disadvantaging the disadvantaged.

quotemstr on Aug 18, 2018

In some situations, "perfect is the enemy of the good" is a reasonable approach. Unfortunately, we're not dealing with a good-but-imperfect filter, but a practically useless filter, one that imposes significant costs on top of being ineffective.

"Weapons of Math Destruction" presents good evidence that 90% is much lower than the actual effectiveness of X.

Blaaguuu on Jan 29, 2018

I'm in the middle of reading a pretty interesting book, where that is one of the core arguments - Weapons of Math Destruction, by Cathy O'Neil - I recommend checking it out if you are curious to learn more about the distinction that you made, and more of how we tend to abuse math through algorithms that are in some way designed by humans.

pooya72 on Apr 18, 2017

Yeah, there's a good book on this very issue called "Weapons of Math Destruction."[1] The author did an interview on a podcast and discusses how algorithms used by these judges reflect our racial biases.[2] One example that I remember is that the algorithm takes into consideration one's zip code, which correlates strongly with race in the US.

[1]: https://weaponsofmathdestructionbook.com/
[2]: http://www.econtalk.org/archives/2016/10/cathy_oneil_on_1.ht...

kzograf on Apr 3, 2019

Hi, creator of this project here.

I am a software product manager with background in programming. I am creating #BooksByWomen at thebooksbywomen.com, curating a list of books written by women, especially in the tech/business category as a side project. You could head out to https://www.producthunt.com/makers-festival-2019/voting#tiny... and vote if you like it.

Made with:
Sheet2site.com
The love and help from the communities at WomenMake, Women in Product, Tech Ladies and a lot of my friends.

I read Cathy O'Neil's book on algorithms, Weapons of Math Destruction, and I absolutely loved it. It resonated with a lot of the insights I was seeing myself in my career. So I paid attention to women authors in my network, and there are so many that never would have reached me through traditional marketing and social media if I hadn't sought them out specifically. This is why I am curating and maintaining this list and want it to succeed.

What I would ideally want to do:
* Create a page to honor my supporters
* Publish regular bi-weekly interviews with women authors about their journeys
* Publish featured books based on common themes (coming up)
* Create a browser extension where this content can be shared effectively
I do hold a full-time job, so carving out time is a challenge; any support helps.
You can also support me most directly by visiting the site and buying some books from awesome female authors, I make a small affiliate commission from Amazon.

You can also support me by buying me coffee or ramen on Patreon: https://www.patreon.com/kalinaz

clumsysmurf on Aug 20, 2016

Yet another book out recently which explores this topic:

"Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy"

https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

kevinSuttle on Dec 22, 2016

  Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (Cathy O'Neil)

Whoa. Need to check that one out. Thanks.

lazyasciiart on May 27, 2020

"a bit sci-fi"? What does that mean? We allow AI systems to decide who goes to jail in some places - I'd say that's a clear enough sign that the field needs people thinking all the way through their work.

There are some good entry-level books on the topic, like "Weapons of Math Destruction", that you might enjoy.

flother on Mar 1, 2018

And Cathy O’Neil's book Weapons of Math Destruction, mentioned in the episode, is also well worth reading.

https://www.amazon.co.uk/dp/0141985410

bem94 on Sep 19, 2017

I couldn't agree more. Engineers are often so far (physically and metaphorically) from the actual impact of what they work on that it can be really easy to miss the damage it does.

People who work for Facebook must have to work very hard to remember that the people they are experimenting on are people, not just "users", and that they don't have carte blanche in how they treat them. Especially if they are building addiction into their service!

Also, Weapons of Math Destruction is a fantastic book.

MagicAndi on Sep 19, 2017

Interesting article; it raises concerns similar to those voiced by Cathy O'Neil in her book 'Weapons of Math Destruction', and also by Tim Wu in The Attention Merchants. Those of us working in software rarely stop to think about the biases we're hardcoding into our programs and how they can impact users. We really need to wake up to how we are affecting people's lives.

denzil_correa on Oct 18, 2017

> In other words, by the time we notice something troubling, it could already be too late.

For me, this is the key motivating point - the horse may have left the barn by the time we act. A lot of the time people say this is exaggeration, but "Weapons of Math Destruction" is a nice read on the unintended side effects of this phenomenon [0].

[0] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

a_bonobo on May 23, 2018

Some books I enjoyed and which could be beneficial:

- Borges' Aleph, and his essays (for example Other Inquisitions). He was fascinated by concepts programming is famous for now; he would have loved recursion

- Mann's The Wizard And The Prophet. He uses the life of two highly influential scientists (Borlaug and Vogt) as exemplars for two ways of viewing progress, one highly focused on technological progress, one being sceptical of technological progress

- Wachter-Boettcher's Technically Wrong, about implicit biases in modern technology and how they exclude or mistreat people. Some good lessons if you want to start a company and get as many customers as possible ;) A similar book with more academic stringency: Weapons of Math Destruction

- Roman's Writing That Works, non-fiction/memo/email advice from big advertising guys, lots of good advice on getting your point across

- Fromm's The Art Of Loving & The Sane Society, two non-fiction books from a sociologist/psychologist on how to work on your relationships (i.e., love as a movie concept doesn't exist, it's mostly very hard work and self-critique), and how society as a whole has very broken goals. Becker's The Denial Of Death (on how fear of death is a major drive in life) comes from a similar place

- Wilson's How To Teach Programming - delves deeply into the psychology of learning and how to build stable communities, it's available for free so why not

- Statistics books are always good to mend your thinking! My favorite layman's introduction is Motulsky's Intuitive Biostatistics (no formulas, plain English), for a non-practitioner Wheelan's Naked Statistics is probably better

- Seth Godin wrote lots of good business books, pick one (I liked Linchpin)

- If you haven't read them yet, get the 'classics' of software management: The Mythical Man-Month, and Facts and Fallacies of Software Engineering

- Cuckoo's Egg is a marvellously fun non-fiction book about a programmer tracking a hacker in the 80s, one of the first international computer crime cases, featuring tons of fun low-tech hacks

Areibman on July 24, 2020

It's definitely one of the least technical ones on the list, but "Weapons of Math Destruction" is a must-read for anyone deploying models that directly affect users. Its general point is that opaque judgement systems (such as the seemingly arbitrary content removal on social media platforms these days) are only going to get worse with black-box models. AI ethics are seldom taught, and this book provides a good framework for thinking about how to build effective yet fair systems.

Lowkeyloki on Apr 6, 2019

You're welcome. I can't really recommend any specific books. I don't want my personal views on politics to be the focus, though. I'm just wary of biases hidden in supposedly unbiased algorithms.

If that topic interests you, I can recommend Weapons of Math Destruction by Cathy O'Neil, and Algorithms of Oppression by Safiya Noble.

I'm a programmer myself, so I understand if people are initially defensive about these topics. But it's important to me that I never write software that's harmful to people, so I try to check my biases as I create software to avoid these more insidious issues. It's easier to spot issues that may arise from programming machinery that could crush a person, for example, or software designed to estimate fuel usage for an airplane. It's much more difficult to spot inequity toward minorities in software.

There's plenty of examples out there of biased AI in response to biased training sets.

If you're more interested in the dystopia that must be created in order to "win" SimCity, here are a couple of interesting articles. (I think they've been shared here before.) By taking the game to its absolute extremes, it becomes clearer what the simulation considers most important and valuable, thereby laying bare some of the biases inherent in the game's design.

https://rumorsontheinternets.org/2010/10/14/magnasanti-the-l...

https://www.vice.com/en_us/article/4w4kg3/the-totalitarian-b...

red_blobs on Sep 23, 2016

Remember when memes were lauded as a part of the millennial culture? Now when they are used for something you disagree with, it's seen as 'evil', 'wrong', or 'unethical'.

The problem is that the vast majority of voters are swayed by emotion, not fact.

These sorts of things are only going to get worse, because it is the only way to get people to vote your way. Even if you have a good idea and a sound plan, you need to dress it up in emotion-laced slop to get people to come out and vote for you.

It also doesn't help that the mainstream media, which is a very powerful force in the US when it comes to politics, is biased toward the Democratic party (as seen in the recent Wikileaks emails from the DNC). It means that to counteract this, you need to try another tactic, like posting on the Internet.

Even our leaders are swayed by emotion. Both Obama and Hillary have commented prematurely on important events ('Clockboy' and various police shootings) without having professionals and science weigh in on the actual facts of the events after a real investigation.

This is one of the main problems with our society today: anti-science winning out over facts and assuming someone is guilty before even attempting to see if they are innocent.

There was even a book on the New York Times best-seller list called 'Weapons of Math Destruction' claiming that math and statistics are somehow 'racist'. Think about that for a minute and let it sink in... Facts are now racist.

Social media has made it worse because instead of just having the mainstream media feed us hyperbole and rhetoric, anybody with a Twitter account can do it too.

It has now had some real-world consequences and resulted in many people getting hurt and even getting killed in riots over half-truths, hearsay, and rumors.

If you want shit posting to stop, we have to live in a society where it has to stop working so well. Maybe even holding people responsible for posting lies that lead to riots or death.

Edit: sigh. I always try to have intellectual conversations here on HN and am always disappointed. Most people here seem to just want to hear the current San Francisco narrative about the world and live only within that bubble. It's actually really sad.

Dowwie on Nov 21, 2016

For more information about "ethics and algorithms", read: "Weapons of Math Destruction" [1] or at least listen to the EconTalk podcast between the author and host Russ Roberts [2]

[1] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

[2] http://www.econtalk.org/archives/2016/10/cathy_oneil_on_1.ht...

cpeterso on Nov 14, 2018

For more on the consequences of big data and algorithms for humans, check out mathematician Cathy O'Neil's book "Weapons of Math Destruction". She describes problematic systems that are opaque, difficult to contest, and scalable, thereby amplifying inherent biases to affect increasingly larger populations.

https://en.m.wikipedia.org/wiki/Weapons_of_Math_Destruction

huntermeyer on July 7, 2020

Reminds me of a chapter in the book, "Weapons of Math Destruction"[1].

[1] https://en.wikipedia.org/wiki/Weapons_of_Math_Destruction

gajjanag on May 1, 2017

The issue raised by this article is discussed in a broader context in the excellent book by Cathy O'Neil: "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy".
The dangers of proprietary, secret algorithms making judgements at critical junctures (e.g. whether a person is sentenced or not) are raised in the introduction of the book.

The book also gives copious concrete examples of these dangers. In particular, it describes the LSI-R (Level of Service Inventory-Revised) questionnaire and how it effectively pinpoints race even though it never actually asks for the person's race (which would be illegal).

medecau on May 29, 2018

The book "Weapons of Math Destruction" goes into some detail on how these models are being used to exploit people when in some cases they could be used to help them.

https://en.wikipedia.org/wiki/Weapons_of_Math_Destruction

rspeer on Aug 24, 2017

US News still publishes rankings, that's why every college has three Vice-Chancellors of Gaming the US News Rankings.

(See Cathy O'Neil's "Weapons of Math Destruction", where she blames US News rankings specifically for the bloat of universities and the overuse of adjuncts.)

reureu on Sep 22, 2019

If you're interested in this topic, you should look at "Weapons of Math Destruction."

It's a question of whether the outcomes are distributed differently among races because the races represent truly different probabilities, or because the inputs to the model were already tainted against some race. For example, say you judge creditworthiness based on membership in a particular honors society, but that honors society doesn't exist at historically black colleges.

acjohnson55 on June 23, 2018

The book Weapons of Math Destruction talks all about this. I've come to believe that pure risk shouldn't be the only factor in a person's interest rate.

That obviously makes a ton of sense from a business standpoint. You want to contain losses for risky borrowers but compete with other lenders for low risk borrowers.

But socially, this is perverse. People tend to be risky because they are already poor. So now money costs more for those who have the least of it. This is one of the feedback loops that makes poverty (and affluence, for that matter) so sticky.

I had this realization in my personal experience when I was able to refinance almost $100k in student loans at a crazy low interest rate. My household's finances are in great shape as my wife and I enter our prime earning years. But for us, such an opportunity is a gift, on top of an already sweet situation. The savings could be a game changer for a family whose finances are more marginal.

lmkg on Oct 17, 2018

For-profit colleges (at least the large ones) are actively predatory in targeting desperate, uneducated people.

Something I learned from the book Weapons of Math Destruction: there's an entire industry around skimming contact information from poor people to target for certain products. People will spend heavily on AdWords advertising for terms like "apply for food stamps," and have a web page that offers the service of expediting your food stamp application for free. Of course, this website is just a form that takes your info and forwards it to the regular online food stamp application. Maybe it's got better UX, but it's not expediting anything.

As part of handling the application, the website operator can save the name, address, and phone number of someone who is highly likely to be poor and desperate. They sell this information to certain companies, predominantly diploma mills and payday loan companies, who will then call these people directly to aggressively market their services.

I don't have the book at my fingertips, but if my memory is right, the contact details for one qualified lead sells for like $75.

Nebbers on Sep 15, 2018

Highly recommend everyone read "Weapons of Math Destruction" by Cathy O'Neil. I have tried to stop using services that collect my data as much as possible, and when anyone throws out the argument "you must have something to hide", it's good to have a few real-life examples of why everyone should fight back against data collection and information brokers.

rab-the-goat on Feb 28, 2017

Your point hits on a true thing. One problem is that companies measure proxies for performance, not performance itself. A great book on the topic (and related topics) is Weapons of Math Destruction. Anyway, a green checkbox is pretty far into proxy-land. It's not very closely related to client retention or profitability, and now we see it's not even related to operational time of the equipment. Yikes. So a proxy like this is not even worth using as a metric; it can only cause false confidence that some information is known, and that leads to bad decisions. Not the least of which is giving bonuses to incompetent managers.

cpeterso on Oct 14, 2018

For examples of the consequences of big data and ML, check out mathematician Cathy O'Neil's book "Weapons of Math Destruction". She posits that the problematic systems share three key features: they are opaque, unregulated and difficult to contest, and at the same time scalable, thereby amplifying any inherent biases to affect increasingly larger populations.

https://en.m.wikipedia.org/wiki/Weapons_of_Math_Destruction

unethical_ban on Jan 23, 2019

I would recommend that anyone interested in Machine Learning, or "predictive modeling" etc. should read the book "Weapons of Math Destruction".

https://weaponsofmathdestructionbook.com/

I don't have my notes with me and I'm only 1/3 through, but the main theme is that the best predictive algorithms:

* work transparently for all parties (the creators, users, and "inputs", often people).

* have no feedback loop (the use of data from the model should not further entrench the output of the model).

And a few others. It gets into discrimination and other major flaws of data modeling re: recidivism, school admissions, stock trading, and other things.

Not all algos are racist - but there are definite attributes to avoid, and this book (or a more rigorous version) should be mandatory reading for all "data scientists".

harrumph on Dec 21, 2017

>Cathy O'Neil wrote a book called "Weapons of Math Destruction" -- interesting read

+1000. That book should be required reading for anyone working in machine learning. Written by a former Wall Street quant who has the math down cold.

What she knows about rampant bias in allegedly politically agnostic machine learning circles is that the formulation and production of answers is trivial when compared to the formulation and production of questions.

Super-relevant to this thread is her work on recidivism risk scoring algos run on prisoners and defendants. The feedback loops that these algos spur are seriously damaging the lives of huge numbers of persons in the criminal justice system far beyond proportionality for the offenses that brought them there.

jameane on Jan 17, 2019

But do not disregard the other advantages you may have to help you pay off that debt in 3 years:

- being married, so you can share costs
- access to well-paying jobs
- presumably direct or indirect family support (e.g. being able to live with family cheaply so your income can be directed to debt payments)

We have told people, particularly people with low incomes or disadvantaged backgrounds, that college is the ticket to a high-paying job. And then for-profit and other sketchy institutions prey on these people to get them to take out loans for low-value degrees that probably aren't even completed.

The chapter in "Weapons of Math Destruction" on how for-profit colleges use "big data" to prey on low-income people is terrifying. They look for people getting government assistance and other indicators of "struggle" and use Facebook retargeting to bombard them with ads on why they need to go to college at a for-profit institution.

You kind of have to luck out on a lot of parameters to easily pay off student loan debt. Even Barack Obama couldn't pay off his student loans until recently.

wiglaf1979 on Oct 15, 2019

Two books dive well into this subject; unlike this article, they have a bit more meat to their research. I would recommend giving them a read before getting out your pitchforks over this article alone. It's bad and needs to be fixed, but a purely reactionary response, instead of a measured one, will just make things worse.

Weapons of Math Destruction
https://www.goodreads.com/book/show/28186015-weapons-of-math...

Automating Inequality https://www.goodreads.com/en/book/show/34964830-automating-i...

noiszytech on Mar 31, 2017

This is an awesome (and thorough) response and a great idea. I totally agree about the arms race and how AI + crowdsourced data could be applied to create much more realistic fake viewing patterns. I'm super-glad that this conversation is taking place.

But, wouldn't collecting viewing habits and then using AI to define (and emulate) real-looking behavior immediately put the developer(s) in that moral grey area that so many algorithms occupy? Technically it could be done, and it would be fascinating to work on, but we'd have to start with a huge browsing dataset (creepy) and then process it to figure out the patterns (exactly what this tool is trying to subvert), and then feed that back as output from within the user's browser (probably feeding back indistinguishable-but-AI-driven data and creating a loop). It's a murky space to wade into, and one that needs a lot more conversation.

Instead I decided to just keep it simple. The first page is chosen randomly from a list of (user-approved) sites. A link on that page is chosen randomly from the list of links that open in the same window and point to the same domain, and that's clicked. That's repeated a somewhat-random number of times, usually about 2-7 times, before a new site is chosen from the user-approved list, and the process starts over.
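That simple loop can be sketched in a few lines of Python (a rough approximation of the behavior described above, not the extension's actual code; the `fetch_links` helper is hypothetical and stands in for whatever DOM scraping the real tool does):

```python
import random

def random_walk(approved_sites, fetch_links):
    """Simulate the browsing pattern described above.

    approved_sites: list of user-approved starting URLs.
    fetch_links: callable returning the same-domain links found on a
                 page (hypothetical; a real extension reads the DOM).
    """
    # First page is chosen randomly from the user-approved list.
    page = random.choice(approved_sites)
    visited = [page]
    # Follow a somewhat-random number of same-domain links (about 2-7).
    for _ in range(random.randint(2, 7)):
        links = fetch_links(page)
        if not links:
            break  # dead end: stop this walk early
        page = random.choice(links)
        visited.append(page)
    return visited  # the caller then starts over with a new site
```

Each call produces one short random walk; repeating it over the approved list yields the noise pattern the comment describes.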

Check out Cathy O'Neil's definition of "Weapons of Math Destruction" (good overview of her book here: http://money.cnn.com/2016/09/06/technology/weapons-of-math-d...) - I'd love to hear your thoughts on that framework for determining the morality of algorithms.

corodra on Nov 30, 2019

I did say "if I agreed/believed it". But if it were true, that's where I believe the discourse would end up.

But to be fair, PR stunt dynamics are better understood these past two decades compared to the 80s. The last decade alone, you can see the reaction of the public to news instantly, react to it and see the public reaction-reaction instantly, with well formatted analytics. That's the only reason I don't immediately dismiss the idea like it was a flat-earther conspiracy theory. There's plenty of analytics to try making a solid plan to do something like that. At the same time, I have zero proof that it's ever been done intentionally.

There's a book called Weapons of Math Destruction. In it, the author talks about Facebook's potential ability to sway public discourse and increase the chances of certain votes using its analytics knowledge. The author said that while it's a scary thought, it could never happen, and there's no evidence it has been done or would be done. This book came out in 2016 (books are typically published 6-12 months after the manuscript is accepted, so it was written roughly 2015). So... yea.

dredmorbius on Oct 8, 2019

The case for automated discrimination is made very well by Cathy O'Neil in Weapons of Math Destruction (https://www.worldcat.org/title/weapons-of-math-destruction-h...). I recommend it strongly.

There are several classes of problem. One is, of course, intentional discrimination. The larger problems are likely either a failure to care about, consider, or attend to problems, or, most insidiously, side effects that arise entirely unintentionally.

The fact that much gradient descent machine learning is opaque to explanation means that such AI essentially becomes a new form of knowledge: like science, it provides answers, but unlike traditional Baconian scientific methods, it doesn't answer why or how, and fails to provide cause or mechanism.

Given its use in increasingly complex, large-scale systems, without ready human review or oversight, this creates the conditions for numerous unfortunate, probable consequences.

gjulianmonApr 11, 2021

The book "Weapons of math destruction" contains a myriad of real examples of big data and AI punishing people and making their situation worse, in the US, with no totalitarianism needed. For example, use of AI in credit scores or policing, where the place you live will make the AI score you worse or think you're more likely to be a criminal, therefore making it harder for you to get out of any bad situation and reinforcing the "people from this zone are bad" idea.

chonglionDec 9, 2019

If the case you're making is that you think that there should be a national effort to correct for historical injustices that were done by the state by actively discriminating by race, that is a completely different discussion.

That is what proponents of the structural racism model are doing. Here's an example I took from the book Weapons of Math Destruction:

When people are convicted of a crime, they undergo a number of personality tests, including the LSI-R (Level of Service Inventory - Revised). This is a highly detailed questionnaire that asks about prior convictions, whether the prisoner had accomplices in their crimes, whether drugs or alcohol were involved, etc.

It does not ask about race.

What it does ask about are things which highly correlate with race, such as the number of police encounters (no criminal suspicion necessary), the number of friends/family/neighbours who have committed crimes, etc. If two first-time offenders have committed identical crimes but one of them grew up in wealthy suburbs and the other grew up in the rough inner city, they will receive very different scores on the LSI-R.

So what do they use the LSI-R for? They feed it into a model which assigns the offender a recidivism risk score. Then they use that risk factor directly when determining the person's sentence, restrictions, parole eligibility, etc.

So now we're not even talking about historical injustices, we're talking about ongoing injustice based on historical injustice. It's a vicious cycle, a self-reinforcing feedback loop, if you will. This is a serious problem!

Edit: Just to add another piece of the puzzle, the reason wealthy suburbs vs rough inner cities correlate so highly with race is a direct result of the historical racist practices of redlining [1] and white flight [2]. Now combine that with grinding poverty (also a result of redlining and segregation) and the war on drugs, and the result is high-crime neighbourhoods in the inner city. Those high crime neighbourhoods attract highly increased police presence, which leads to more convictions, which leads to more patrols, etc. This is another vicious cycle which feeds into the above statistical model.

[1] https://en.wikipedia.org/wiki/Redlining

[2] https://en.wikipedia.org/wiki/White_flight

maym86onJuly 31, 2018

No thanks. Bail + blockchain + AI is still just bail. The point is this is still based on proving your or your family's ability to pay bail, with added "AI", which is notoriously bad for this kind of thing because the historical training data is full of bias.

The book Weapons of Math Destruction goes into the issues with machine learning and the prison system.

https://weaponsofmathdestructionbook.com

> Families apply for bail on the platform.

The fact that you focus on a family's ability to pay just highlights one way in which the system is biased against people without family support.

bostikonJan 8, 2021

For quite a bit longer, I think.

The book Weapons of Math Destruction was published in 2016, and the conclusions therein were not exactly new at the time either.[ß] The business model of ad-based social media is focused on generating engagement, above all else.

Nothing generates more engagement than outrage, fear, or playing to primal instincts. To satisfy that demand, content providers are encouraged to generate the maximal outrage and foment fear. Social media platforms are incentivised to direct people towards that content, because they make more money that way.

The end result: social media platforms automate pile-on and radicalisation at the speed of lies, at global scale. Not because they are inherently evil, but because they are making money off of people being inherently terrible.

ß: The real meat of the book is in the first two chapters, as far as I'm concerned. Everything else in there is just repeating the arguments.

SwizeconJune 19, 2018

> I realized that there is no pleading or reasoning with The Machine.

Weapons of Math Destruction was one of the most eye-opening books I've read recently. It's all about how machines, when set up with self-reinforcing models, can become a real big problem.

And the fact that nobody quite understands how The Machine makes decisions isn't helping either.

barrkelonJuly 24, 2017

Are you familiar with correlation vs causation? Without a causal link, making an inference based on correlation is unsound. Encoding unsound reasoning in an AI model, particularly when it reinforces an existing social imbalance, would be less than ethical.

"Weapons of Math Destruction" by Cathy O'Neil delves into this in more depth. It's a very valid concern, particularly in the way non-technical people are trained over time to give deference to the algorithm.

chonglionOct 15, 2019

I recall reading a story in the book Weapons of Math Destruction [1] about a system used to assess people’s recidivism risk which judges were relying on for sentencing hearings. The problem is that the system showed a clear racial disparity in the risk scores yet at the same time it was more accurate.

That means if we try to tune the system to make it less racist, we’ll be making it less accurate. In essence, the system isn’t really racist, it’s a reflection of the racism in society which is leading to these outcomes. Ultimately, the problem is that putting someone in jail increases the likelihood that their relatives will commit crimes. It increases the likelihood that both they and their friends and family will reoffend. It’s a vicious cycle and it doesn’t appear to have any technical solution.

[1] https://en.wikipedia.org/wiki/Weapons_of_Math_Destruction

caresource_taonJan 30, 2018

This doesn't seem like the case but Cathy O'Neil (the author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy) raises the point that we often see algorithms implemented in non-traditional ways when society faces tough questions. It makes me really hesitant about using algorithms as silver bullets to solve hard problems in society.

nlonNov 18, 2016

I thought this was a joke when I read the abstract, but it appears to be a genuine paper.

This paragraph in particular is one of the worst examples I've ever seen of researchers NOT UNDERSTANDING WHAT THEY ARE DOING:

Unlike a human examiner/judge, a computer vision algorithm or classifier has absolutely no subjective baggages, having no emotions, no biases whatsoever due to past experience, race, religion, political doctrine, gender, age, etc., no mental fatigue, no preconditioning of a bad sleep or meal. The automated inference on criminality eliminates the variable of meta-accuracy (the competence of the human judge/examiner) all together.

Please, read Weapons of Math Destruction and understand how excellent machine learning is at discovering and exploiting the biases in datasets.

Edit, no, sorry, it gets worse:

the upper lip curvature is on average 23.4% larger for criminals than for non-criminals.

the distance d between two eye inner corners for criminals is slightly shorter (5.6%) than for non-criminals

SavantIdiotonMar 3, 2021

> Does that not depend on your perspective?

Well, it depends on your breadth of understanding of the issue. If you have a limited scope, it doesn't seem like a big problem. And it also assumes that we all agree on certain basic human rights. If we don't then yes, it is a matter of perspective.

Read "Weapons of Math Destruction". It discusses how existing algorithms discriminate in the following case-studies:

- courtroom sentencing (and recidivism prediction)

- mortgage and loan rate determination

- educator performance

- job applications that use 3rd party screening tools

The book looks at specific examples of these black-box algorithms being deployed in ways that have ruined people's lives with no accountability. The ethics concerns being raised now were needed a decade or more ago, and the problem is only getting worse with the ad-hoc deployment of this half-baked technology to almost every industry.

I have to stress again that these black-box algorithms are ALREADY IN USE and cannot be subpoenaed by courts because they are considered intellectual property.

So yes, it is urgent to crack open this technology, because it is all too easily being sold without any investigation, accountability, or thought to the consequences. "Move fast and break things" doesn't work if it ruins people's lives by landing them in jail or pushing them into poverty.

The book demonstrates this is not a what-if strawman, but reality.

JtsummersonMar 23, 2021

The data usually has clear biases present against certain ethnic groups and economic classes. You also have to look into which laws are broken and fed into the data (again, this reflects back on the first sentence). If jaywalking and other minor crimes go into the prediction algorithms, are those crimes treated equally throughout the area and population? Is it really the case that there's no jaywalking in the middle-class neighborhoods, or is it just that the police only apply the law in the poor neighborhoods? This creates a bias in patrols, where they step up in areas with more charges, which makes sense on the surface until you examine which areas those are and why they have more charges in that area or amongst that population.

For a fuller treatment on this I recommend Weapons of Math Destruction by Cathy O'Neil (https://www.amazon.com/Weapons-Math-Destruction-Increases-In...).

ThrowItAway2DayonSep 26, 2019

There's a reason that the majority of student loan debt in the US is from for profit colleges [1].

They are predatory, ill-regulated, and unstable. ITT Tech, WGU, and the like all algorithmically target audiences that are likely to get approved for Federal student loans: veterans, single mothers, first-in-family to go to college. They get them approved for these loans and the funds go directly to the school whether they graduate or not. There's a very interesting chapter on this in the book "Weapons of Math Destruction." Don't believe their nonsense ads about being the future of education; the incentives just don't align. Go to local community or state public schools if private is not affordable. Either way, choose not-for-profit, whether public or private.

[1] https://phys.org/news/2019-06-for-profit-america-student-deb...

acdhaonSep 24, 2016

> There was even a book on the New York best sellers list called 'Weapons of Math Destruction' claiming that math and statistics are somehow 'racist'. Think about that for a minute to let it sink in....Facts are now racist.

I don't know where you read that but you really, really, really need to learn about that book before repeating it, much less complaining about the lack of intellectual conversation.

Cathy O'Neil has a Ph.D. in mathematics (https://www.genealogy.math.ndsu.nodak.edu/id.php?id=38230), worked as a quant, and most definitely is not claiming that math is racist. Rather, she's talking about how MISUSING math – and especially machine learning – can reinforce biases which were already present or introduced by sampling error. She's actually calling for greater mathematical understanding so people keep these things in mind and avoid them:

http://www.npr.org/2016/09/12/493654950/weapons-of-math-dest...

https://mathbabe.org/2014/08/12/weapon-of-math-destruction-r...

dredmorbiusonFeb 13, 2021

Cathy O'Neil's done some excellent work in this area. "Algorithms are opinions embedded in code."

There's a summary of it on the TED Radio Hour here:

https://www.npr.org/2018/01/26/580617998/cathy-oneil-do-algo...

TED Talk https://embed-ssl.ted.com/talks/cathy_o_neil_the_era_of_blin... (video)

Cathy O’Neil Is Unimpressed by Your AI Bias Removal Tool (A RedTail Q&A)
https://redtailmedia.org/2018/10/29/redtail-talks-about-flip...

Weapons of Math Destruction: Cathy O'Neil adds up the damage of algorithms
https://www.theguardian.com/books/2016/oct/27/cathy-oneil-we...

Human Insights missing from Big Data: https://www.ted.com/talks/tricia_wang_the_human_insights_mis...

Further references:

Weapons of Math Destruction outlines dangers of relying on data analytics: https://www.npr.org/2016/09/12/493654950/weapons-of-math-des...

Can Big Data Really Help Screen Immigrants? https://www.npr.org/2017/12/15/571199955/dhs-wants-to-build-...

ubernostrumonAug 13, 2017

The worst part of the information age is arbitration by unreasonable and impenetrable algorithms rather than humans

The thing is, people only tend to notice it when it affects them personally (either they are the victim of the algorithm, or someone they know/like/support is). The world has long worked on irrational biases, which now are being used as the training data for decision-making systems which are subsequently declared to be "objective" because people believe an algorithm can't be biased. And increasingly, the mark of privilege is having access to a system -- applications, interviews, customer service, even courts -- which will use human judgment instead of an unreviewable algorithm.

For more on the topic I suggest the book Weapons of Math Destruction by Cathy O'Neil.

catwellonDec 22, 2016

I finished Keynes, Hayek: The Clash that Defined Modern Economics (Nicholas Wapshott), started in 2015.

I read:

* Turn the Ship Around!: A True Story of Turning Followers into Leaders (David Marquet)

* Joy at Work: A Revolutionary Approach To Fun on the Job (Dennis Bakke)

* Ne vous résignez pas ! (Bruno Le Maire - French politician)

* Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (Michael Hiltzik)

* Disrupted: My Misadventure in the Start-Up Bubble (Dan Lyons)

* Making Things Happen: Mastering Project Management (Scott Berkun)

* Basic Economics: A Common Sense Guide to the Economy (Thomas Sowell)

* The Success of Open Source (Steve Weber)

* Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (Cathy O'Neil)

* Programming in Lua (fourth edition - I read every edition)

I started reading (and will probably finish by the end of the year) Overcomplicated: Technology at the Limits of Comprehension (Samuel Arbesman).

As for what I recommend, it depends what you are into, but I would say I really enjoyed Making Things Happen, which is a must if you have any kind of project management to do, and Basic Economics.

lmkgonAug 1, 2020

Read Weapons of Math Destruction by Cathy O'Neil. The book explores several ways that's already happening. Her main premise is that there's a feedback loop in many data-driven policies. You only get success results for the things that you try, and you only try the things you already think are likely to succeed. As a result, algorithmic policies tend to reinforce the status quo.

Loan risk algorithms will favor people "similar to" those who have paid back loans before, a sample group biased towards people that banks have already loaned to before. As a result, a lot of the factors are biased towards "from a white upper-middle-class suburban background."

And recidivism estimators, which are used as jail sentencing guidelines in some places.

Screening algorithms for job resumes, and college applications.

Algorithms send police to where crimes are reported. Crimes are reported because the police are there to witness them. The area gets designated a high-crime area. Regular people are arrested more often because regular activity is suspicious in a high-crime area, affecting their future prospects. The higher arrest rate is used to justify this.

It's a continuous spectrum rather than a single point. But if I were to pick a single "point" where it became a self-fulfilling prophecy? 1994, due to the widespread passage of three-strikes laws.

cpetersoonAug 26, 2018

I just started reading Weapons of Math Destruction by mathematician Cathy O'Neil. She warns about big data systems that codify racism and classism from flawed data and self-fulfilling feedback loops. The systems' "unbiased" decisions are opaque, proprietary, and often unchallengeable.

https://en.wikipedia.org/wiki/Weapons_of_Math_Destruction
