Hacker News Books

40,000 HackerNews book recommendations identified using NLP and deep learning


The Beginning of Infinity: Explanations That Transform the World

David Deutsch, Walter Dixon, et al.

4.6 on Amazon

63 HN comments

Cosmos: A Personal Voyage

Carl Sagan, LeVar Burton, et al.

4.8 on Amazon

63 HN comments

Stumbling on Happiness

Daniel Gilbert

4.3 on Amazon

58 HN comments

A Mind for Numbers: How to Excel at Math and Science (Even If You Flunked Algebra)

Barbara Oakley PhD

4.6 on Amazon

56 HN comments

Molecular Biology of the Cell

Bruce Alberts, Alexander D. Johnson, et al.

4.5 on Amazon

54 HN comments

The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power

Shoshana Zuboff

4.5 on Amazon

46 HN comments

Skunk Works: A Personal Memoir of My Years at Lockheed

Ben R. Rich, Leo Janos, et al.

4.8 on Amazon

46 HN comments

Industrial Society and Its Future: Unabomber Manifesto

Theodore John Kaczynski

4.7 on Amazon

44 HN comments

Chaos: Making a New Science

James Gleick

4.5 on Amazon

44 HN comments

Enlightenment Now: The Case for Reason, Science, Humanism, and Progress

Steven Pinker, Arthur Morey, et al.

4.5 on Amazon

43 HN comments

How to Measure Anything: Finding the Value of Intangibles in Business

Douglas W. Hubbard

4.5 on Amazon

41 HN comments

The Shock Doctrine: The Rise of Disaster Capitalism

Naomi Klein

4.7 on Amazon

40 HN comments

Chaos Monkeys: Obscene Fortune and Random Failure in Silicon Valley

Antonio Garcia Martinez

4.2 on Amazon

40 HN comments

Algorithms to Live By: The Computer Science of Human Decisions

Brian Christian, Tom Griffiths, et al.

4.6 on Amazon

39 HN comments

The Right Stuff

Tom Wolfe, Dennis Quaid, et al.

4.6 on Amazon

37 HN comments


trapper on Feb 13, 2009

Someone hasn't read the excellent book "How to Measure Anything".

aalhour on June 27, 2021

* Thinking Fast and Slow

* The Selfish Gene

* Probability: For the Enthusiastic Beginner

* How to Measure Anything

* Rationality: From AI to Zombies

* Cynefin: Weaving Sense-Making into the Fabric of Our World

* Major works of Friedrich Nietzsche

eigenrick on June 24, 2014

How to Measure Anything -- By Douglas Hubbard

The successful role models in my life have a keen ability to measure, at least relatively, large, nebulous things. For thinking in probabilities and gauging intangibles like "effectiveness," I think this book is excellent.

robi-y on June 6, 2020

Somewhat related but more generally about estimation:
Hubbard, How to Measure Anything
https://www.amazon.com/How-Measure-Anything-Intangibles-Busi...

mindcrime on Jan 19, 2020

> Foundations of Decision Analysis by Hubbard.

Sounds interesting, but I couldn't find this in a quick preliminary search. Do you have a link handy? The only book titled "Foundations of Decision Analysis" I came across was by Howard and Abbas.

Also, not sure if this is related to the Hubbard you refer to or not, but there's a gentleman named Douglas Hubbard who has written some really excellent material in this area. I consider his book How To Measure Anything to be one of the best / most important books I've read, and it's one I recommend to pretty much everybody.

petewailes on Nov 30, 2020

It's also just basic hypothesis testing - formulating and then proving/disproving a null hypothesis. But most of the people I talk to can't remember what that means, if they ever knew. Hence assumptions and attacking as the language - they can work that out.

Also, Farsighted is a great book. If you liked it, you'd also get a kick out of Creating Great Choices, The Choice Factory, Alchemy: The Surprising Power of Ideas That Don't Make Sense, and How to Measure Anything: Finding the Value of Intangibles in Business.

jacobkg on Dec 23, 2018

How to Measure Anything - Douglas Hubbard

This book is a treatise against the notion that some important things can’t be measured. Full of information about how to figure out what should be measured and then how to measure it. Very thorough, and he managed to answer every objection I could come up with along the way.

Deep Work - Cal Newport

Starts with the thesis that a generation of workers have forgotten how to concentrate on mentally challenging tasks. Full of ideas and inspiration for rebuilding your stamina for intense focused thought.

kozak on July 22, 2016

1. "The Lean Startup" by Eric Ries

2. "Insanely Simple" by Ken Segall

3. "How to Measure Anything" by Douglas W. Hubbard

ozgooen on Dec 31, 2015

Honestly, my favorite resource for much of this is the book How to Measure Anything by Douglas Hubbard. He goes into detail on the value of information and on how and why to use Monte Carlo simulations.

Video: https://www.youtube.com/watch?v=w4fHGTsZZD8
Book: http://www.amazon.com/How-Measure-Anything-Intangibles-Busin...

mindcrime on Mar 10, 2020

> It’s really not very good

Compared to what?

> But my advice is, skip this one.

And read what instead?

Not trying to start an argument here, I'm genuinely curious, as I consider How To Measure Anything to be one of the best books I've ever read (and I read a lot of books), and I recommend it highly to, well, pretty much everybody. If you feel that there's a better resource out there that relates to these topics, I'd be curious to know about it.

JDDunn9 on Feb 6, 2019

How to Measure Anything, by Douglas W. Hubbard.

When he says anything, he means anything. How to measure the value of a human life, how to estimate things you know nothing about (like the gestation period of an African elephant), and how to get better at measuring things with limited information.

ivv on Apr 21, 2017

From personal experience of doing a bunch of interviews and surveys over the years, after about 6, you'll start hearing patterns. If you are new to the subject you are researching, doing some reading and talking to about six people will get you to a point where you'll be able to evolve the questions you are capable of asking, or formulate hypotheses for testing.

"How To Measure Anything" has a great chapter on how talking to only a few people can reduce uncertainty with a pretty amazing accuracy, but I don't have a copy handy.

batterseapower on Mar 9, 2020

If you like the idea of this library, you'll probably like the book "How to Measure Anything" by Douglas Hubbard (https://www.goodreads.com/book/show/20933591-how-to-measure-...). It's all about how to get sensible confidence intervals for things that are often considered unmeasurable, such as the value of IT security. The book mostly uses Excel for this modelling, but riskquant looks like an excellent alternative for the more technically minded practitioner.

andrey_utkin on Feb 3, 2021

> What if the "value" is actually less than the cost of the task (but it's still absolutely necessary)?

This is a false premise. But it's surprising how many people seem to hold it, to their peril.

> I have a feeling that all you're doing is shifting the complexity around. The underlying complexity is still there

That's right, but do you agree this approach moves it to a place where it makes more sense, where it informs good decisions and is manageable?

> it's impossible to accurately estimate a development task and impossible to measure developer productivity

It's not impossible, but it's not something we as a society or an industry have a firm grasp on yet. On this topic, I like best the books by Doug Hubbard: "How to Measure Anything" and "The Failure of Risk Management: Why It's Broken and How to Fix It".

It just requires yet another unusual mindset: probabilistic thinking, in addition to the above-established value-based thinking. You have to use a technique called calibrated probability assessment. We started practicing this at my workplace, and it seems to be working as intended, but we're not well calibrated yet.

lostphilosopher on Apr 9, 2015

I'm a huge believer in going back to primary texts, and understanding where ideas came from. If you've liked a book, read the books it references (repeat). I also feel like book recommendations often oversample recent writings, which are probably great, but it's easy to forget about the generations of books that have come before that may be just as relevant today (The Mythical Man Month is a ready example). I approach the reading I do for fun the same way, Google a list of "classics" and check for things I haven't read.

My go to recommendations:

http://www.amazon.com/Structure-Scientific-Revolutions-50th-... - The Structure of Scientific Revolutions, Thomas Kuhn, (1996)

http://www.amazon.com/Pragmatic-Programmer-Journeyman-Master... - The Pragmatic Programmer, Andrew Hunt and David Thomas (1999)

Things I've liked in the last 6 months:

http://www.amazon.com/How-Measure-Anything-Intangibles-Busin... - How to Measure Anything, Douglas Hubbard (2007)

http://www.amazon.com/Mythical-Man-Month-Software-Engineerin... - The Mythical Man-Month: Essays on Software Engineering, Frederick Brooks Jr. (1975, but get the 1995 edition)

http://www.amazon.com/Good-Great-Some-Companies-Others/dp/00... - Good To Great, Jim Collins (2001)

Next on my reading list (and I'm really excited about it):

http://www.amazon.com/Best-Interface-No-brilliant-technology... - The Best Interface is No Interface, Golden Krishna (2015)

ingqondo on Mar 9, 2021

How to Measure Anything is a fantastic book. Here are the most significant insights you take away from it:

- how to measure anything; Hubbard actually delivers on the promise of the title: after finishing the book you will truly feel that the scope of what you can measure is massive. He does this by changing the definition of what it means to measure something, and you come to realize his definition is more correct than the everyday intuitive one.

- value of information; Hubbard gives a good introduction to the value-of-information (VOI) concept from economics, which lets you put a price on any measurement and prioritize what to measure

- motivation for 'back of the napkin' calcs; through his broad experience he has seen how many of the most important things affecting a business go unmeasured, and how his approach to 'measuring anything' can empower people to measure what really matters.

Reading this book provided one half of what I have been searching for for a long time: a framework for thinking about data science activities that is not based on hype, is fundamentally correct, and is still intuitive and practical.

mindcrime on Feb 14, 2018

If you're trying to compare 100+ ideas and choose the "best" one to explore, I'd suggest looking into a simulation based approach. Monte Carlo simulation[1] is probably a good place to start. There are dozens of textbooks that cover the topic.

Now the downside to this is that you have to have parameter ranges for the model to simulate, and you don't necessarily know the probability distribution for each variable in the model up front. That means you have to estimate/guess at them. This makes the exercise slightly error-prone. There is, however, a mechanism you can use to teach yourself (or others) to do a better job of estimation. The technique I'm thinking of is "calibrated probability assessment"[2].

The book How To Measure Anything[3] by Douglas Hubbard does a really nice job of laying out how to use calibrated probability assessments, mathematical models, and monte carlo simulation, to build a probability distribution for things that look hard/impossible to measure.

Anyway, if you build a model for all of your ideas, and monte carlo simulate all of them to get a probability distribution for the return, then you at least have something somewhat objective to base a decision on.

One last note though: when doing this kind of simulation, one big risk (aside from mis-estimating a parameter) is that you leave a particular parameter out completely. I don't know of any deterministic way to make sure you include all the relevant features in a model. The best way I know of to address that is to "crowd source" some help and get as many people as you can (people who have relevant knowledge / experience) to evaluate and critique your model.

[1]: https://en.wikipedia.org/wiki/Monte_Carlo_method

[2]: https://en.wikipedia.org/wiki/Calibrated_probability_assessm...

[3]: https://www.amazon.com/How-Measure-Anything-Intangibles-Busi...
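The simulation approach above can be sketched briefly. The `simulate_idea` helper and all parameter ranges are invented for illustration; a real exercise would use calibrated estimates and a richer model:

```python
import random

def simulate_idea(revenue_rng, cost_rng, trials=100_000):
    """Sample revenue and cost from triangular distributions given as
    (low, high, mode) and return the simulated net returns."""
    return [random.triangular(*revenue_rng) - random.triangular(*cost_rng)
            for _ in range(trials)]

# Hypothetical calibrated estimates for two competing ideas
idea_a = simulate_idea(revenue_rng=(50, 200, 90), cost_rng=(30, 80, 50))
idea_b = simulate_idea(revenue_rng=(10, 400, 60), cost_rng=(20, 120, 70))

for name, returns in [("A", idea_a), ("B", idea_b)]:
    returns.sort()
    mean = sum(returns) / len(returns)
    p5 = returns[int(0.05 * len(returns))]
    p95 = returns[int(0.95 * len(returns))]
    print(f"Idea {name}: mean {mean:.0f}, 90% interval [{p5:.0f}, {p95:.0f}]")
```

The resulting distributions, not just the means, are what you compare: an idea with a slightly lower mean but a much tighter downside may still be the better bet.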

dr_dshiv on Mar 8, 2021

Let me second "How to measure anything." I think it should be required reading for human beings.

theologic on Dec 24, 2015

Got a new job in May, which slowed me down, but I got through around 8 this year.

I'm a Lencioni fan:

Death by Meeting -- Describes 3 types of meetings

Getting Naked -- Describes how to consult

I'm also a Marshall Goldsmith fan:

What Got You Here Won't Get You There - Once you get beyond a Director level with some mistakes, read this book

Mojo, How to Get It, How To Keep It - Another "look yourself in the mirror" book

Also:

21 Irrefutable Laws of Leadership - John Maxwell. A little prod to act more like a leader.

Ready Player One -- Ernest Cline, Great Young Adult Book. Escapist fantasy.

Every Shot Counts -- Mark Broadie, a statistical look at golf, but with a hint of Kahneman

To Kill A Mockingbird -- Timeless classic I never got to. Loved Atticus. I won't read Go Set a Watchman if it spoils my view of what Atticus was all about.

Started But Not Finished:

Business Dynamics -- John Sterman (out of MIT). I need to take time off work to read this because it is so massive. Basically it is control theory applied to business modelling. I am convinced that if somebody can apply these models, it really is the best competitive advantage; however, too few people are willing to stick with it.

How to Measure Anything -- Douglas Hubbard. Sort of makes me mad because it is so commonsense, yet most businesses don't apply this commonsense approach.

clavalle on Apr 7, 2017

> There's no way you could in good conscience employ someone who couldn't do the work

Oh, if it was only so simple as 'can do the work' | 'cannot do the work'!

It is, of course, a very coarse and rough estimate, but it is more like 'This person is likely to be great! They will not only do the work but surprise and delight me and pull our organization forward more solidly and quickly than we even hoped. We are lucky they happened to be looking just at the time we need their skills!' or 'This person looks like a good candidate, but they tend to jump jobs on a yearly basis and have experience in C# rather than Java, much less Scala; but we've been looking for a while, we really need to get started, and they seem smart enough to get up to speed.'

Point is, it's a complicated process with complicated factors.

Also, there is the wider market to consider. If I don't think a candidate can go out and get another job for the same price or more, I am unlikely to offer a premium on top of that market price.

So I have to disagree; skilled jobs are not binary in nature -- the max of the range offered represents how much a top tier candidate that fits well with the position should target. For those others without the same risk/reward profile or market power, they will have to keep from overselling themselves and take those factors into account.

If you like Algorithms to Live By, I think you'd enjoy 'How to Measure Anything' which really gets into the nitty gritty of how to reduce uncertainty and how even modest reductions can lead to much more solid decisions with these kinds of inherently fuzzy and complex problems.

mindcrime on Dec 23, 2018

How to Measure Anything - Douglas Hubbard

I personally consider this one of the most valuable non-fiction books I've ever read. It would be hard for me to state emphatically enough how strongly I recommend this book and the author's approach. Using calibrated probability assessments, an understanding of nth order effects, and Monte Carlo simulations, is a process that everyone should have in their toolkit.

The stuff on AIE and portfolio management I found less valuable, but all in all it's a great book.

aozgaa on Aug 14, 2020

> As a discrete set of alternatives start to emerge, I strongly suggest quantitatively modeling out the impact of each one and revisit your Setting — specifically the why, the optimization function. It can be very hard with ambiguous decisions to get down into the numbers, but it’s very valuable to do so. Figure out your value metric, your success criteria.

In software engineering decisions, estimating the superiority of an alternative (e.g. some performance measure) can be a long process, depending on the level of detail required. Napkin calculations can be done in minutes, but building a prototype can take as long as delivering a working product.

When the assessment of alternatives is abridged, confidence in the decision is compromised. This may be one reason (not including changing requirements) why agile methods are popular — they absolve stakeholders of committing to an alternative up front.

Douglas Hubbard’s “How to Measure Anything” discusses some ways to assess the information-value of improving estimates. His approach can help strike a compromise between doing napkin calculations and building fully-functional systems.

mindcrime on July 15, 2016

The Four Steps To The Epiphany - Steve Blank

Code by Charles Petzold

Artificial Life - Steven Levy

Time Reborn - Lee Smolin

The Singularity is Near - Ray Kurzweil

Surfaces and Essences - Douglas Hofstadter

How to Measure Anything - Douglas Hubbard

> One of my favorites is How Not to Be Wrong by Jordan Ellenberg

I have that on my list of "to read real soon now". Sounds fascinating.

rahimnathwani on Apr 23, 2021

The book 'How to measure anything' talks about how to improve your own calibration.

From what I recall (it's been a while since I read it), the author recommended testing yourself on a series of questions with numerical answers. For example: what's the height of the Empire State Building?

You write down a range for each answer, trying to make the range narrow enough that you're just about 80% confident that the actual answer is within the range.

By doing this repeatedly, and periodically reviewing your cumulative correct rate, you can calibrate appropriately (e.g. widening or narrowing your ranges for future questions).
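That exercise can be sketched as a tiny self-scoring loop. Everything below (the questions, the approximate answers, the ranges, and the `calibration_score` helper) is invented for illustration, not taken from the book:

```python
# Illustrative calibration self-test; answer values are approximate.
questions = [
    ("Height of the Empire State Building, to the tip (m)", 443),
    ("Year the first transatlantic telegraph cable was laid", 1858),
    ("Gestation period of an African elephant (days)", 645),
]

def calibration_score(ranges):
    """ranges: one (low, high) interval per question, each meant to be an
    80%-confidence range. Returns the fraction of true answers that fall
    inside their interval; over many questions a well-calibrated
    estimator should land near 0.80."""
    hits = sum(low <= answer <= high
               for (low, high), (_, answer) in zip(ranges, questions))
    return hits / len(questions)

# Ranges the estimator claimed to be "80% sure" about
score = calibration_score([(300, 500), (1800, 1850), (500, 700)])
print(f"hit rate: {score:.2f}")
```

A hit rate persistently below 0.80 means the ranges are too narrow (overconfidence); persistently above, too wide. Widening or narrowing future ranges is the calibration step the comment describes.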

mindcrime on Mar 8, 2021

Depending on the context, I'm a fan of the work of Douglas Hubbard, in his book How to Measure Anything[1]. His approach involves working out answers to things that might sometimes be done as a "back of the napkin" kind of thing, but in a slightly more rigorous way. Note that there are criticisms of his approach, and I'll freely admit that it doesn't guarantee arriving at an optimal answer. But arguably the criticisms of his approach ("what if you leave out a variable in your model?", etc.) apply to many (most?) other modeling approaches.

On a related note, one of the last times I mentioned Hubbard here, another book came up in the surrounding discussion, which looks really good as well. Guesstimation: Solving the World's Problems on the Back of a Cocktail Napkin[2] - I bought a copy but haven't had time to read it yet. Maybe somebody who is familiar will chime in with their thoughts?

[1]: https://www.amazon.com/How-Measure-Anything-Intangibles-Busi...

[2]: https://www.amazon.com/gp/product/0691129495/ref=ppx_yo_dt_b...

stiff on July 2, 2019

IIRC the book "How to measure anything" contains some advice like this:
https://www.amazon.com/dp/1118539273/

The author also offers webinars, so maybe it was from him:
https://www.howtomeasureanything.com

bluGill on Apr 13, 2017

Actually, it is a good metric. However, the metric is not perfectly linear; it is within 20% of linear. If you are not within 20%, you need to do something about it. In general, once you are within that range, further effort toward perfection is not worthwhile.

Remember the real goal: money. Sometimes it is money from sales, sometimes it is money saved. (Even in the case of a charity, where there are higher goals, money is a proxy for the real goal, since it can be applied to the goal in some other way.) If management can predict with reasonable accuracy when features will be done, they can translate that into how much it will cost. Then they compare cost vs. expected rewards (expected rewards are marketing's job) and decide whether they should focus on feature A, B, or both.

Note that many managers fail to understand error bars. There is no way to know exact numbers. However you can predict your likely error, and if the error is too high you can spend more money to reduce the error.

I recommend the book "how to measure anything" for more detail.

In the meantime, when management wants perfect linear burn-down charts, there is only one way to achieve them: overestimate your stories, finish them early, and then go home. If you are paid for a 40-hour week you should average about 15 hours a week, but once in a while you will need to work a 40-hour week (60 hours every 10 years or so). Most management considers this unreasonable (for obvious reasons), but if a perfectly linear slope actually is that important to them, they will agree.

arikr on May 17, 2018

To anyone who reads "Measure What Matters," I strongly recommend as an additional book "How to Measure Anything" - I think it'll be really useful in enabling people to measure things that they may have previously assumed to be immeasurable, which means that these things can then be better optimized for and improved.

mindcrime on Feb 21, 2021

The one way I know of to quantify something like "what's the value of preventing something bad from happening" is to use the kind of techniques described in Douglas Hubbard's How To Measure Anything[1]. Teach some experts within the customer company how to do calibrated probability assessment[2], then build a model, and run a Monte Carlo simulation[3] over that model. That gives you a way to at least loosely quantify things. It's not perfect, but for what you're doing it probably doesn't need to be.

[1]: https://www.amazon.com/How-Measure-Anything-Intangibles-Busi...

[2]: https://en.wikipedia.org/wiki/Calibrated_probability_assessm...

[3]: https://en.wikipedia.org/wiki/Monte_Carlo_method

blowski on July 20, 2016

If they could be easily derived, we'd all be doing it all the time. Spend some time doing it before you apply for your next job (or salary review) and you might be pleasantly surprised at how well the conversation goes. I linked to "How to Measure Anything" in another comment, and that's a good read - https://www.amazon.co.uk/How-Measure-Anything-Intangibles-Bu....

If you really can't find a way for your current job, then say how many downloads your open source project has got. Or how many comments or page views your blog gets. For some reason, employers get excited when I tell them "I'm in the top 3% on StackOverflow". (Yes, I know how ridiculous that sounds.)

But that guy who earns twice as much as you and does half the work? This is what he does. He talks in the language of the people who decide his salary, and that language involves specific numbers that matter to the business.

mindcrime on Jan 19, 2020

How To Measure Anything[1] by Douglas Hubbard.

The basic gist of the book goes something like this: in the real world (especially in a business setting) there are many things which are hard to measure directly, but which we may care about. Take, for example, "employee morale" which matters because it may affect, say, retention, or product quality. Hubbard suggests that we can measure (many|most|all|??) of these things by using a combination of "calibrated probability assessments"[2], awareness of nth order effects, and Monte Carlo simulation.

Basically, "if something matters, it's because it affects something that can be measured". So you identify the causal chain from "thing" to "measurable thing", have people who are trained in "calibrated probability assessment" estimate the weights of the effects in the causal chain, then build a mathematical model, and use a Monte Carlo simulation to work out how inputs to the system affect the outputs.

Of course it's not perfect, since estimation is always touchy, even using the calibration stuff. And you could still commit an error like leaving an important variable out of the model completely, or sampling from the wrong distribution when doing your simulation. But generally speaking, done with care, this is a way to measure the "unmeasurable" with a level of rigor that's better than just flat out guessing, or ignoring the issue altogether.

[1]: https://www.amazon.com/How-Measure-Anything-Intangibles-Busi...

[2]: https://en.wikipedia.org/wiki/Calibrated_probability_assessm...
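As a sketch of that causal-chain idea ("morale" affects attrition, attrition affects cost), here is a toy model; the `annual_replacement_cost` helper and every number in it are invented for illustration, and the real procedure would use calibrated 90% ranges from trained estimators:

```python
import random

def annual_replacement_cost(trials=100_000, headcount=200):
    """Monte Carlo over a two-link causal chain: morale drives the
    attrition rate, attrition drives replacement cost. Uniform ranges
    stand in here for properly calibrated estimates."""
    costs = []
    for _ in range(trials):
        attrition = random.uniform(0.05, 0.25)          # annual attrition rate
        cost_per_exit = random.uniform(20_000, 60_000)  # hiring + ramp-up
        costs.append(headcount * attrition * cost_per_exit)
    return costs

costs = sorted(annual_replacement_cost())
median = costs[len(costs) // 2]
print(f"median annual replacement cost: ${median:,.0f}")
```

The output is a distribution, not a point estimate, which is exactly what makes the "unmeasurable" quantity usable: you can read off a median, a 90% interval, or the probability the cost exceeds some threshold.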

tryitnow on Mar 3, 2016

For a more formal treatment of using probabilities to estimate valuations read Douglas Hubbard's How to Measure Anything. It's an eye-opening approach to thinking about things that are hard to measure (like startup valuations).

Hubbard uses elements of information theory to help structure measurement problems. His key insight is that it's worth paying to reduce uncertainty. Most people know this intuitively but resist actually putting numbers to the idea.

Staged funding rounds are a way to reduce uncertainty.

mindcrime on Dec 25, 2014

The Four Steps To The Epiphany - Steve Blank

Neuromancer - William Gibson

Predictable Revenue - Aaron Ross, Marylou Tyler

The Fountainhead - Ayn Rand

The Ultimate Question 2.0 - Fred Reichheld

The Singularity is Near - Ray Kurzweil

Moonshot! - John Sculley

Zero To One - Peter Thiel

Republic - Plato

Meditations - Marcus Aurelius

Nineteen Eighty-Four - George Orwell

Fahrenheit 451 - Ray Bradbury

The Mysterious Island - Jules Verne

Discipline of Market Leaders - Michael Treacy, Fred Wiersema

False Memory - Dean Koontz

NOS4A2 - Joe Hill

Revival - Stephen King

Barbarians At The Gate - John Helyar and Bryan Burrough

Into Thin Air - Jon Krakauer

How To Measure Anything - Douglas Hubbard

and any collection of the works of H.P. Lovecraft.

mindcrime on Oct 17, 2018

"Always Be Leaving" ~ Jeff Thull

I'm a big fan of Jeff Thull's approach to sales, as laid out in his books Mastering The Complex Sale, Exceptional Selling, and The Prime Solution. It really goes against the grain of the old school "grab 'em by the throat and don't let go until they buy" mentality. He rejects the kind of stuff you might associate with the sales guys in "Glengarry Glen Ross" and advocates a much more respectful and honest approach, where the goal is to serve a role closer to that of a doctor or a private detective than that of the stereotypical "used car salesman" type.

I couldn't do it justice trying to explain it here, but if this all sounds interesting, I really recommend reading at least Mastering The Complex Sale to get the idea straight from the source.

I also recommend reading How To Measure Anything by Douglas Hubbard. It has nothing to do with sales, at least on the surface. But in terms of understanding customer problems, I think the approach Hubbard espouses can be tremendously useful at a certain point in the process. And I think it can tie back to Thull's idea that if you work closely with the customer to actually jointly develop a solution and explain the value it creates, then there won't be any of the typical "closing" issues, since there won't be any question about the value of the solution.

stdbrouw on Dec 16, 2015

Well, if you really want to learn how to think probabilistically in everyday life, I'd recommend Douglas Hubbard's "How to Measure Anything", which contains detailed advice on how to calibrate your estimates (so you don't continually over- or underestimate the probability of various events) and how to base risk management and strategic decision making on this knowledge. Probably useful for the startup folk here.

http://smile.amazon.com/How-Measure-Anything-Intangibles-Bus...

rramadass on July 2, 2019

>Even bad estimates are better than no estimates

Absolutely not! An estimate should not be a random number; it should be constrained by available data, however small that may be. If you feel you don't have enough to form a "guesstimate", don't give me a number; first work on finding the data that will enable you to form a proper estimate.

Once you give an estimate, no matter how many times you explain that it is a "guesstimate", people tend to lock on to the given number. It then becomes a real battle trying to explain the hurdles (and there are always some unknowns) while revising the original estimate. Soon mutual distrust develops between the implementation engineers (stressful and detrimental to actual execution) and management, leading to everybody losing faith in estimates. Agile/Scrum have exacerbated the problem with their short time windows and sprints. In one team that I was on, people just gave up and started quoting 2 weeks for any and every feature, trivial or non-trivial, and the whole exercise became meaningless.

PS: The book "How to Measure Anything: Finding the Value of Intangibles in Business" is worth reading to get some ideas on how one might do proper estimation.

rahimnathwani on July 13, 2014

The overconfidence/underconfidence issue is similar to that described in the book 'How to Measure Anything'. The author claims to have trained people to become better estimators by presenting them with estimation challenges and providing feedback about how often the actual value fell within their stated range. IIRC, people were asked to give a range for which they were 80% confident. If the answer was within the range too often (e.g. 90% of the time), their ranges were too wide; if not often enough, they were overconfident. Over time, most subjects improved their calibration.

I started (yesterday) working on a simple web app to train people in this way. It's not yet ready to try out, but you can bookmark it here: https://github.com/rahimnathwani/measure-anything

mindcrime on Oct 28, 2017

> We need ways to translate these into numbers that we can compare with profit margins, etc.

There is a way. I'll refer you to How To Measure Anything by Douglas Hubbard. His model is based on a combination of things:

1. Calibrated Probability Assessments

2. nth order effects

3. building a mathematical model

4. Monte Carlo simulation

Apply his methodology and you can determine the impact of a "hard to quantify" variable like "security" and get a probability distribution that can be used to assign values to specific scenarios.

Yeah, it's a little bit complicated and time-consuming; but the best things in life are, no?

qznc on May 23, 2020

From Douglas W. Hubbard, How to Measure Anything (3rd ed.) via https://lobste.rs/s/kk89vp/back_envelope_estimation_hacks#c_...

> There is a 93.75% chance that the median of a population is between the smallest and largest values in any random sample of five from that population.

> It might seem impossible to be 93.75% certain about anything based on a random sample of just five, but it works. To understand why this method works, it is important to note that the Rule of Five estimates only the median of a population. Remember, the median is the point where half the population is above it and half is below it. If we randomly picked five values that were all above the median or all below it, then the median would be outside our range. But what is the chance of that, really?

> The chance of randomly picking a value above the median is, by definition, 50%—the same as a coin flip resulting in “heads.” The chance of randomly selecting five values that happen to be all above the median is like flipping a coin and getting heads five times in a row. The chance of getting heads five times in a row in a random coin flip is 1 in 32, or 3.125%; the same is true with getting five tails in a row. The chance of not getting all heads or all tails is then 100% − 3.125% × 2, or 93.75%. Therefore, the chance of at least one out of a sample of five being above the median and at least one being below is 93.75% (round it down to 93% or even 90% if you want to be conservative).
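The arithmetic in the quoted passage can be checked directly, and a small simulation (using an arbitrary skewed population; the lognormal choice is mine, not the book's) confirms the Rule of Five empirically:

```python
import random

# Analytic: the only failures are "all five above the median" or
# "all five below", each with probability 0.5 ** 5 = 1/32
analytic = 1 - 2 * 0.5 ** 5
print(analytic)  # 0.9375

# Empirical check: draw samples of five from a skewed population and see
# how often the true median lands between the sample min and max
population = sorted(random.lognormvariate(0, 1) for _ in range(100_001))
true_median = population[len(population) // 2]

trials = 50_000
hits = sum(min(s) <= true_median <= max(s)
           for s in (random.sample(population, 5) for _ in range(trials)))
print(hits / trials)  # close to 0.9375
```

The simulated hit rate hovers around 0.9375 regardless of how skewed the population is, which is the point: the Rule of Five needs no distributional assumptions.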

mindcrime on May 29, 2017

Gosh, there's so many. But these come to mind:

1. Neuromancer - William Gibson

2. Snow Crash - Neal Stephenson

3. Hackers - Heroes of the Computer Revolution - Steven Levy

4. How to Measure Anything - Douglas Hubbard

5. Godel, Escher, Bach - Douglas Hofstadter

6. The Pragmatic Programmer - Andy Hunt and Dave Thomas

7. The Soul of a New Machine - Tracy Kidder

8. Code - Charles Petzold

9. The Shockwave Rider - John Brunner

10. Ambient Findability: What We Find Changes Who We Become - Peter Morville

11. Don't Make Me Think - Steve Krug

12. The Design of Everyday Things - Donald A. Norman

13. The Mythical Man-Month: Essays on Software Engineering - Fred Brooks

14. Decline and Fall of the American Programmer - Ed Yourdon

15. Cube Farm - Bill Blunden

16. The Philip K. Dick Reader

17. The Cuckoo's Egg - Clifford Stoll

18. The Prince - Niccolò Machiavelli

19. The 48 Laws of Power - Robert Greene

20. The Atrocity Archives - Charles Stross

21. Business @ the Speed of Thought: Using a Digital Nervous System - Bill Gates

Built with ♥ by tracyhenry
