Tuesday, April 21, 2015

Both rational optimizing and conspiracy over-estimated.

A good old bulletin board conversation that I just came across: Grand truths about human behavior, by Edward Tufte. Tufte is, of course, the author of several wonderful books dealing with visual communication, such as The Visual Display of Quantitative Information.

Kind of a mish-mash. Here are the ones that I thought were particularly interesting:
It's more complicated than that.

Unintended consequences inevitably attend purposive social action. (Robert K. Merton)

All the world is multivariate.

Much of the world is distributed lognormally.

People are different.

Rehearsal improves performance.

Effective intervention-thinking and choice-thinking necessarily require reasoning about comparisons, approximations, opportunity costs, and causality.

All grand theories, other than perhaps the scientific method, ultimately err (and some collapse) by overreaching.

"For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled." (Richard Feynman)

In explanations of human activities, both muddling through and incompetence are under-estimated, and both rational optimizing and conspiracy over-estimated.

Nearly all self-assessments claim above-average performance.

It is better to be roughly right than precisely wrong.

Authority gone to one's head is the greatest enemy of truth.

An approximate answer to the right question is worth a great deal more than a precise answer to the wrong question.

There never was anything by the wit of man so well-devised or so sure established which in continuance of time has not been corrupted.

If something cannot go on forever, it will stop.

90 percent of everything is crud.

It is a human trait to organize things into categories. Inventing categories creates an illusion that there is an overriding rationale in the way the world works.

No matter how many options there are, it is human nature to always narrow things down to two polar, yet inextricably linked choices.

Personal truths are often perceived as universal truths. For instance, it is easy to imagine that a system or design that works well for oneself will work for everyone else.

It's not what you're thinking, it's what they're seeing.

Never attribute to malice that which can be adequately explained by stupidity. (Hanlon's Razor)

Sufficiently advanced stupidity is indistinguishable from malice. (Grey's Corollary)

It is the commonest of mistakes to consider that the limit of our power of perception is also the limit of all there is to perceive.

Faced with the choice between changing one's mind and proving there is no need to do so, almost everyone gets busy on the proof.

There is a principle which is a bar against all information, which is proof against all arguments and which cannot fail to keep a man in everlasting ignorance - that principle is contempt prior to investigation.

Nothing is always absolutely so. (Sturgeon's Law)

Later on there is this observation, which addresses one of the issues in decision-making. In the Decision Clarity Consulting methodology, after clarifying definitions, providing measurable context, and establishing shared assumptions, the first question about any statement or phenomenon is: Is it real? That is followed by: Do we understand the causes? Describe the issue, measure it, explain it.

Tufte has something of a variant of that.
"If you know nothing, take the average or use persistence forecasting. To describe something, observe averages and variances, along with deviations from persistence forecasting. Understanding, however, requires causal explanations supported by evidence."

"Average" is meant both in the statistical sense and in the wisdom-of-crowds sense. "Persistence forecasting" is, for example, saying the tomorrow will be like today (and is often a hard forecast to beat).

There might be a grander (although probably more cryptic) way of making the point about the difference between description and causal explanation. Perhaps this point can be combined with the point about the requirements of intervention thinking.
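
As a concrete gloss on those baselines, here is a minimal Python sketch, with an invented toy series, of the two "know nothing" forecasts and of the "describe" step (average, variance, and deviations from persistence):

    # Toy illustration of the "know nothing" baselines (data invented).
    temps = [61, 64, 63, 67, 66, 65, 68]  # e.g., a week of daily highs

    # Know-nothing baseline 1: forecast the historical average.
    mean_forecast = sum(temps) / len(temps)

    # Know-nothing baseline 2: persistence, i.e. tomorrow will be like today.
    persistence_forecast = temps[-1]

    # "Describe": average, variance, and deviations from persistence.
    n = len(temps)
    mean = sum(temps) / n
    variance = sum((t - mean) ** 2 for t in temps) / (n - 1)
    persistence_errors = [b - a for a, b in zip(temps, temps[1:])]

    print(f"mean forecast: {mean_forecast:.1f}")            # 64.9
    print(f"persistence forecast: {persistence_forecast}")  # 68
    print(f"sample variance: {variance:.1f}")               # 5.8
    print(f"day-to-day deviations: {persistence_errors}")   # [3, -1, 4, -1, -1, 3]
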
He then elaborates and asks an important question.
Stereotyping and its issues (useful generalization v. destructive ignorance) are picked up by my "know nothing, explain something" item:

"If you know nothing, take the average or use persistence forecasting. To describe something, observe averages and variances, along with deviations from persistence forecasting. Understanding, however, requires causal explanations supported by evidence." "Average" is meant both in the statistical sense and in the wisdom-of-crowds sense. "Persistence forecasting" is, for example, saying the tomorrow will be like today (and is often a hard forecast to beat)."

Thus stereotyping is a "know nothing" strategy which often works, but sometimes there are relevant variances that should be assessed and explained.

The deeper issue is: What strategies identify those situations when variances become relevant? When is it worthwhile to consider possible anomalies from average or persistence forecasting--and not merely reopen solved problems?

Stereotyping is conceptually much maligned in the common vernacular, for obvious historical reasons. Nonetheless, in the absence of an understanding of some observed phenomenon, stereotyping is the closest thing we can do cognitively to address the unknown. Stereotyping can consist either of finding the closest category into which to fit the phenomenon and hoping that it is close enough to be useful or, as Tufte counsels, of taking the average or using persistence forecasting.

The stereotype may not be right, but in the absence of any information at all, it triggers "It is better to be roughly right than precisely wrong."
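
Tufte leaves his deeper question open. One mechanical, admittedly partial, way to operationalize it is a control-chart-style check: keep using the cheap baseline until its errors drift well outside their historical spread, and only then pay the cost of seeking a causal explanation. A sketch, with the function name, data, and two-standard-deviation threshold all invented for illustration:

    import statistics

    def variance_worth_investigating(history, new_value, k=2.0):
        """Flag a new observation whose persistence-forecast error falls
        outside k standard deviations of past persistence errors.
        (Illustrative only: name, data, and threshold are invented.)"""
        errors = [b - a for a, b in zip(history, history[1:])]
        latest_error = new_value - history[-1]
        return abs(latest_error) > k * statistics.stdev(errors)

    history = [61, 64, 63, 67, 66, 65, 68]            # invented daily highs
    print(variance_worth_investigating(history, 69))  # False: ordinary wobble
    print(variance_worth_investigating(history, 90))  # True: worth explaining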

