Sep 25, 2015 1:57 pm ET
Can You See the Future? Probably Better Than Professional Forecasters
By Jason Zweig
Illustration: Christophe Vorlet
Can’t anybody here play this game?
Three-quarters of all U.S. stock mutual funds
have failed to beat the market over the past
decade. Last year, 98% of economists
expected interest rates to rise;
they fell instead. Most energy analysts didn’t foresee oil’s collapse from
$145 a barrel in 2008 to $38 this summer — or its 15% rebound since.
A new book suggests that amateurs might well be less-hapless
forecasters than the experts — so long as they go about it the right
way.
I think Philip Tetlock’s “Superforecasting: The Art and Science of
Prediction,” co-written with the journalist Dan Gardner, is the most
important book on decision making since Daniel Kahneman’s “Thinking,
Fast and Slow.” (I helped write and edit the Kahneman book but receive
no royalties from it.) Prof. Kahneman agrees. “It’s a manual to
systematic thinking in the real world,” he told me. “This book shows
that under the right conditions regular people are capable of
improving their judgment enough to beat the professionals at their own
game.”
The book is so powerful because Prof. Tetlock, a psychologist and
professor of management at the University of Pennsylvania’s Wharton
School, has a remarkable trove of data. He has just concluded the
first stage of what he calls the Good Judgment Project, which pitted
some 20,000 amateur forecasters against some of the most knowledgeable
experts in the world.
The amateurs won — hands down. Their forecasts were more accurate more
often, and the confidence they had in their forecasts — as measured by
the odds they set on being right — was more accurately tuned.
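That kind of calibration can be measured. The standard yardstick in forecasting tournaments, and the one Prof. Tetlock's project used, is the Brier score: the squared gap between the probability you stated and what actually happened, averaged over all your calls. Lower is better. A minimal sketch in Python, with numbers invented purely for illustration:

```python
# Brier score: mean squared error between stated probabilities and
# outcomes (1 if the event happened, 0 if it didn't).
# Lower is better; always guessing 0.5 scores 0.25.
def brier_score(forecasts, outcomes):
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A well-calibrated forecaster beats an overconfident one who bets
# heavily on the question where the event failed to happen.
print(brier_score([0.9, 0.8, 0.3], [1, 1, 0]))    # ~0.047
print(brier_score([0.99, 0.99, 0.9], [1, 1, 0]))  # ~0.270
```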
The top 2%, whom Prof. Tetlock dubs “superforecasters,” have
above-average — but rarely genius-level — intelligence. Many are
mathematicians, scientists or software engineers; but among the others
are a pharmacist, a Pilates instructor, a caseworker for the
Pennsylvania state welfare department and a Canadian underwater-hockey
coach.
The forecasters competed online against four other teams and against
government intelligence experts to answer nearly 500 questions over
the course of four years: Will the president of Tunisia go into exile
in the next month? Will the gold price exceed $1,850 on Sept. 30,
2011? Will OPEC agree to cut its oil output at or before its November
2014 meeting?
It turned out that, after rigorous statistical controls, the elite
amateurs were on average about 30% more accurate than the experts with
access to classified information. What’s more, the full pool of
amateurs also outperformed the experts.
The most careful, curious, open-minded, persistent and self-critical —
as measured by a battery of psychological tests — did the best.
“What you think is much less important than how you think,” says Prof.
Tetlock; superforecasters regard their views “as hypotheses to be
tested, not treasures to be guarded.”
Most experts — like most people — “are too quick to make up their
minds and too slow to change them,” he says. And experts are paid not
just to be right, but to sound right: cocksure even when the evidence
is sparse or ambiguous.
So the project was designed to force the forecasters “to be ruthlessly
honest about why they think what they do,” says Prof. Tetlock.
First, participants got training materials explaining the basics of how to think about probabilities in an uncertain world.
The forecasters were urged to forage for information that might
disprove their assumptions — and to change their minds at will,
tweaking their predictions as often as new evidence emerged.
One wrote a software program that sorted his online sources of news
and opinion by ideology, topic and geographic origin, then told him
what to read next in order to get the most-diverse points of view.
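The column doesn't describe how that program worked, but the core idea is simple enough to sketch. Here is one hypothetical version: tag each source by ideology, topic and region, then always read the source that overlaps least with what you have already seen. All the names and tags below are invented:

```python
# Hypothetical source-diversifying reading queue: the next pick is
# whichever source shares the fewest tags with what's been read.
sources = [
    {"name": "Source A", "tags": {"left", "markets", "us"}},
    {"name": "Source B", "tags": {"right", "markets", "us"}},
    {"name": "Source C", "tags": {"center", "energy", "europe"}},
    {"name": "Source D", "tags": {"left", "politics", "asia"}},
]

def next_to_read(read_tags, candidates):
    # Smallest intersection with already-seen tags = most diverse pick.
    return min(candidates, key=lambda s: len(s["tags"] & read_tags))

read_tags = set()
queue = list(sources)
while queue:
    pick = next_to_read(read_tags, queue)
    queue.remove(pick)
    read_tags |= pick["tags"]
    print(pick["name"])
```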
After each outcome, the superforecasters analyzed not just whether their forecasts had been right, but also whether their reasoning had been sound and whether the odds they had set were too high or too low.
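That last check is a calibration exercise: among all the questions where you said "70%," did roughly 70% come true? A rough sketch of the bookkeeping, again with invented numbers:

```python
from collections import defaultdict

# Group forecasts into probability buckets and compare each bucket's
# stated odds with the share of events that actually happened.
def calibration(forecasts, outcomes, width=0.1):
    buckets = defaultdict(list)
    for p, o in zip(forecasts, outcomes):
        buckets[round(p / width) * width].append(o)
    return {round(b, 2): sum(v) / len(v) for b, v in sorted(buckets.items())}

print(calibration([0.7, 0.7, 0.7, 0.2, 0.2], [1, 1, 0, 0, 1]))
# Roughly {0.2: 0.5, 0.7: 0.67}: the 70% calls were about right, but
# the 20% calls came true half the time, so those odds were set too low.
```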
Warren Hatch, an analyst at McAlinden Research Partners, an
investment-research firm in New York, says he learned that “just
because you know a lot about something doesn’t mean you’ll be a good
forecaster in that area.” He says “it was humbling” for him to realize
that he “blew almost all” the questions closely related to finance.
Joshua Frankel, a filmmaker and opera director in Brooklyn, N.Y., says
the tournament taught him to “look at the world in a less binary way,
to think much more in terms of probabilities.”
You can cultivate these same skills by visiting
GJOpen.com and joining the
next round of the tournament. Or you can try refining your own
thinking.
Start by zeroing in on the “base rate” — the average historical
experience. If you’re considering whether to invest in an initial
public offering, don’t first bury yourself in the details of why this
particular company might be the next Google. Instead, begin with the
assumption that it will match the returns of the typical IPO, which underperforms the rest of the stock market by two to three percentage points annually in the long run.
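To see what that base rate costs, compound it. A quick sketch, assuming for illustration a 7% annual market return (that figure is invented; the two-to-three-point shortfall is the one cited above):

```python
# Compounding the IPO base rate over a decade, assuming a
# hypothetical 7% annual market return for comparison.
market = 1.07 ** 10    # ~1.97x
ipo_low = 1.04 ** 10   # 3-point shortfall: ~1.48x
ipo_high = 1.05 ** 10  # 2-point shortfall: ~1.63x
print(f"$1 in the market grows to ${market:.2f}")
print(f"$1 in the typical IPO grows to ${ipo_low:.2f}-${ipo_high:.2f}")
```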
Next, ask what the company would have to do to outperform that average
by, say, four percentage points annually — enough to beat the market
overall. Work up a list of all the companies in the past that have
done so and see which factors they seem to have in common. Does this
IPO have the same forces in its favor? Write down your reasoning in
detail and estimate the numerical odds, as precisely as you can, that
you are correct.
— Write to Jason Zweig at intelligentinvestor@wsj.com,
and follow him on Twitter at @jasonzweigwsj.