Politics: Idea Of The Day


To: IQBAL LATIF who wrote (49442) - 11/15/2005 4:19:30 PM
From: IQBAL LATIF
 
Scepticism is rare, or, Descartes vs. Spinoza

By James Montier








Sometime ago a client asked us to compile a list of myths that the markets seemed to hold dear. We came up with twelve potential myths, ranging from 'stocks for the long run' to 'dividends don't matter', via such topics as 'commodities for the future' and 'bond supply matters'. However, this exercise also made me wonder why it was that supposedly smart people ended up believing such strange things.

This pondering sent me (as is usually the case) to the annals of psychology. To some extent these errant beliefs seem to stem from bounded awareness/inattentional blindness and framing. We have explored such elements before. However, there may well be another factor at work. We seem to be hard-wired to 'believe'.

Daniel Gilbert, a professor of psychology at Harvard, has explored how we go about believing and understanding information. In a series of truly insightful papers [1], Gilbert and co-authors examine the belief process using two alternative philosophical viewpoints.
Cartesian systems

The first view is associated with the work of René Descartes. When it came to belief, Descartes suggested the mind performs two separate mental acts. First, it understands the idea; second, it assesses the validity of the idea that has been presented. This two-stage process seems intuitively correct. After all, we can all imagine being presented with some novel idea, holding it in our minds and then pondering the truth or otherwise associated with it. The Cartesian approach fits well with folk psychology.

Descartes was educated by Jesuits and, like many 17th century philosophers, generally deployed psychology and philosophy in the aid of theology. Like anyone of any sense, Descartes was well aware that people were capable of believing things that weren't true. In order to protect the Church, Descartes argued that God had given man the power to assess ideas. So it clearly wasn't God's fault when people believed things that weren't true.

As Gilbert (1993, op cit) notes, Descartes' approach consisted of two axioms: first, the mental separation and sequencing of understanding and believing; and second, that people have no control over how or what they understand, but are totally free to believe or disbelieve ideas as they please.

Spinozan systems

Spinoza's background and thinking could not have been much more different from Descartes'. Born a Jew, Baruch de Espinoza (later to become Benedict Spinoza) outraged his community and synagogue. The tensions finally resulted in Spinoza being excommunicated, accused of abominable heresies and monstrous deeds. The order of excommunication prohibited other members of the synagogue from having any contact with Spinoza.

Freed of the need to conform to his past, Spinoza was able to explore anything he chose. One of the areas to which he turned his considerable mental prowess was the faults contained in the Cartesian approach. Spinoza argued that all ideas were first represented as true and only later (with effort) evaluated for veracity. Effectively Spinoza denied the parsing that Descartes put at the heart of his two-step approach. Spinoza argued that comprehension and belief were a single step. That is to say, in order for somebody to understand something, belief is a necessary precondition. Effectively, all information or ideas are first accepted as true and then only sometimes evaluated as to their truth; once this process is completed, a 'corrected belief' is constructed if necessary.

Libraries

Gilbert et al (1990, op cit) use the example of a library to draw out the differences between these two approaches. Imagine a library with several million volumes, of which only a few are works of fiction. The Cartesian approach to filing books would be to put a red tag on each volume of fiction and a blue tag on each volume of non-fiction. Any new book that appeared in the library would be read, and then tagged as either fiction or non-fiction. Any book that is unread is simply present in the library until it is read.

In contrast, a Spinozan library would work in a very different fashion. Under this approach a tag would be added to each volume of fiction but the non-fiction would be left unmarked. The ease of this system should be clear; it requires a lot less effort to run than the Cartesian approach. However, the risk is that if a new book arrives it will be seen as non-fiction.

Gilbert et al note that under ideal conditions both systems produce the same outcome if allowed to run to conclusion. So if you picked up a copy of Darwin's 'The Expression of the Emotions in Man and Animals' and asked the Cartesian librarian what he knew about the book, he would glance at the tag and say non-fiction. The Spinozan librarian would do pretty much the same thing, concluding the book was non-fiction because of the absence of a tag.

However, imagine sneaking a new book into the library, say the latest Patricia Cornwell thriller. If you took the book to the librarian and asked them what they knew about the book, their response would reveal a lot about the underlying process governing the library's approach to filing. For instance, the Cartesian librarian would say "I don't know what sort of book that is. Come back later when it has been read and tagged appropriately". The Spinozan librarian would glance up, see the absence of a tag and say "it doesn't have a tag so it must be non-fiction" - an obviously incorrect assessment.
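
To make the library analogy concrete, below is a minimal sketch in Python (my own illustration, not code from Gilbert or Montier; the class and method names are hypothetical) contrasting how the two filing systems classify a book that has been shelved but never read.

# Illustrative sketch of Gilbert's library analogy. All names here
# (CartesianLibrary, SpinozanLibrary, classify) are hypothetical.

class CartesianLibrary:
    """Every processed book gets an explicit tag: 'fiction' or 'non-fiction'."""
    def __init__(self):
        self.tags = {}  # title -> 'fiction' or 'non-fiction'

    def process(self, title, is_fiction):
        self.tags[title] = 'fiction' if is_fiction else 'non-fiction'

    def classify(self, title):
        # An unprocessed book is simply unknown.
        return self.tags.get(title, 'unknown - come back when it has been read')

class SpinozanLibrary:
    """Only fiction is tagged; the absence of a tag is read as non-fiction."""
    def __init__(self):
        self.fiction_tags = set()  # titles explicitly tagged as fiction

    def process(self, title, is_fiction):
        if is_fiction:
            self.fiction_tags.add(title)  # non-fiction needs no work at all

    def classify(self, title):
        # An unprocessed book defaults to non-fiction - the source of the error.
        return 'fiction' if title in self.fiction_tags else 'non-fiction'

cartesian, spinozan = CartesianLibrary(), SpinozanLibrary()
for lib in (cartesian, spinozan):
    lib.process("The Expression of the Emotions in Man and Animals", is_fiction=False)

new_thriller = "latest Patricia Cornwell thriller"  # sneaked in, never processed
print(cartesian.classify(new_thriller))  # unknown - come back when it has been read
print(spinozan.classify(new_thriller))   # non-fiction (wrong, but cheap to run)

The trade-off mirrors the point above: the Spinozan system does far less work per book, at the price of confidently misclassifying anything it has not finished evaluating.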

A testing structure

The picture below, taken from Gilbert (1993), shows the essential differences between the two approaches, and also suggests a clever way of testing which of the two has more empirical support.



Say an idea is presented to the brain [2], and then the person considering the idea is interrupted in some fashion. Under a Cartesian system, the person is left merely with an understanding of a false idea, but no belief in it. However, if people are better described by a Spinozan approach, then interrupting the process should lead to a belief in the false idea. So giving people ideas or propositions and then interrupting them with another task should help to reveal whether people operate as Cartesian or Spinozan systems when it comes to belief.
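
As a rough way to see what the two models predict when evaluation is cut short, here is a hedged sketch in Python (again my own illustration, not code from the papers; the function names are hypothetical).

# Hypothetical sketch of the two belief models' predictions when processing
# of a proposition is interrupted before evaluation can finish.

def cartesian_process(proposition_is_true, interrupted):
    # Understanding comes first; belief is only assigned by a later assessment.
    if interrupted:
        return None  # understood, but no belief either way
    return proposition_is_true  # idealised: the assessment recovers the truth

def spinozan_process(proposition_is_true, interrupted):
    belief = True  # comprehension begins by accepting the idea as true
    if not interrupted:
        belief = proposition_is_true  # effortful correction, if we get to it
    return belief

# A false proposition whose evaluation is interrupted:
print(cartesian_process(False, interrupted=True))  # None -> no belief either way
print(spinozan_process(False, interrupted=True))   # True -> the false idea is believed

The experiments described next effectively ask which of these two outcomes better matches what people actually report.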

The empirical evidence

It has long been known that distracting people can impact the belief they attach to arguments. For instance, in their 1994 review Petty et al [3] report an experiment from 1976 which clearly demonstrated the impact of distraction techniques [4].

To test the impact of distraction, students were exposed to a message arguing that tuition at their university should be cut in half. Students listened to the ideas, which were presented over headphones. Some heard strong arguments, others heard relatively weak arguments. At the same time, the students were subjected to a distraction task which consisted of tracking the positions of Xs that were flashed on a screen in front of them. In the high distraction version of the task the Xs flashed up at a fast pace; in the low distraction version the rate was heavily reduced.

The results Petty et al found are shown in the chart below. When the message was weak, people who were highly distracted showed much more agreement with the message than did the people who only suffered mild distraction. When the message was strong and distraction was high, the students showed less agreement than when the message was strong and the distraction was low. Distraction did exactly what it was meant to do: it prevented people from concentrating on the important issue.





Petty et al conclude "Distraction, then, is an especially useful technique when a person's arguments are poor because even though people might be aware that some arguments were presented, they might be unaware that the arguments were not very compelling." Something to bear in mind at your next meeting with brokers, perhaps? The next time an analyst comes around and starts showing you pictures of the next generation of mobile phones, just stop and think about the quality of their investment arguments.

Is there more direct evidence of our minds housing a Spinozan system when it comes to belief? Gilbert et al (1990, op cit) decided to investigate. They asked people to help them with an experiment concerning language acquisition in a natural environment. Participants were shown ostensibly Hopi words with an explanation (such as 'a monishna is a bat'). They had to wait until the experimenter told them whether the word they had been given was actually the correct word in Hopi or whether it was a false statement.

Subjects also had to listen out for a specific tone which, if they heard it, required them to press a button. The tone sounded very shortly after the participant had been told whether the statement was true or false. This was aimed at interrupting the natural processing of the information. Once they responded to the tone, the next Hopi word appeared, preventing them from going back and reconsidering the previous item.

When subjects were later asked about their beliefs, if they worked in a Spinozan way they should recall false propositions as true more often after an interruption than the rest of the time. As the chart below shows, this is exactly what Gilbert et al uncovered.



Interruption had no effect on the correct identification of a true proposition (55% when uninterrupted vs. 58% when interrupted). However, interruption did significantly reduce the correct identification of false propositions (55% when uninterrupted vs. 35% when interrupted). Similarly, one could look at the number of true-false reversals (the right side of the chart above). When false propositions were uninterrupted, they were misidentified as true 21% of the time, roughly the same rate at which true propositions were identified as false. However, when interrupted the situation changes: false propositions were identified as true some 33% of the time, significantly higher than the rate at which true propositions were identified as false (17%).

In another test, Gilbert et al (1993, op cit) showed that this habit of needing to believe in order to understand could have some disturbing consequences. They set up a study in which participants read crime reports with the goal of sentencing the perpetrators to prison. The subjects were told that some of the statements they would read would be false and would appear on screen as red text; the true statements would be in black text.

By design, the false statements in one case happened to exacerbate the crime in question; in the other case they attenuated it. The statements were also shown crawling across the screen - much like the tickers and prices on bubble vision. Below the text was a second row of crawling numbers. Some of the subjects were asked to scan the second row for the number 5 and to press a button when they saw it.

At the end of the experiment, subjects were asked to state what they thought represented a fair sentence for the crimes they had read about. The chart below shows that, just like the previous example, interruption significantly reduced the recognition of false statements (69% vs. 34%) and increased the misidentification of false statements as true (23% vs. 44%).



The chart below shows the average recommended sentence depending on the degree of interruption. When the false statements were attenuating and processing was interrupted, there wasn't a huge difference in the recommended jail term: the interrupted sentences were around 4% lower than the uninterrupted ones. However, when the false statements were exacerbating and interruption occurred, the recommended jail term was on average nearly 60% higher than in the uninterrupted case!



Strategies to counteract naïve belief

The thought that we seem to believe everything in order to understand it is more than a little disconcerting. It would seem to render us powerless to control our beliefs. However, the absence of direct control over our beliefs doesn't necessarily imply we are at their mercy.

Two potential strategies for countering our innate tendency to believe can be imagined. The first is what Gilbert (1993, op cit) calls 'unbelieving'. That is, we can try to carry out the required analytic work to truly assess the veracity of an idea. This certainly appeals to the empiricist in me. My own personal viewpoint is that we should accept very little at face value and use evidence to assess how likely the proposition actually is.

For example, we are often told that stock market earnings can grow faster than nominal GDP over extended periods. Of course, in year to year terms there isn't a close linkage between the two. However, in the long run earnings (and dividends) have grown substantially below the rate of nominal GDP growth (on average earnings have grown 1-2% below the rate of nominal GDP; see Global Equity Strategy, 16 August 2002, Return of the Robber Barrons for more on this).
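
As a rough back-of-the-envelope illustration of what a 1-2% annual shortfall means when compounded (the 5% nominal GDP growth figure below is purely an assumption of mine, not a number from Montier), consider the following Python sketch.

# Back-of-the-envelope compounding: how far earnings drift below nominal GDP
# when they lag it by 1-2% a year. The growth rates are illustrative only.

nominal_gdp_growth = 0.05  # assumed 5% nominal GDP growth per year
years = 30

for shortfall in (0.01, 0.02):  # earnings growing 1% or 2% below GDP
    earnings_growth = nominal_gdp_growth - shortfall
    gdp_index = (1 + nominal_gdp_growth) ** years
    earnings_index = (1 + earnings_growth) ** years
    print(f"{shortfall:.0%} shortfall over {years} years: "
          f"earnings end up at {earnings_index / gdp_index:.0%} of the GDP path")

Even a small annual gap compounds into a large cumulative divergence (roughly 75% and 56% of the GDP path respectively under these assumptions), which is why the 'earnings grow faster than GDP' claim is worth 'unbelieving'.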

Growth investing is another example. I don't doubt that some growth investors are very successful. However, the empirical evidence shows that picking winners in terms of growth is exceptionally difficult and fraught with danger (in that buying expensive stocks with high embodied expectations obviously opens up considerable downside risk if reality falls short of expectations). In contrast, buying cheap stocks offers a margin of safety against disappointment. (See Global Equity Strategy, 16 March 2005, Bargain Hunter for more details.)

So regularly confronting beliefs with empirical reality is one way of trying to beat the Spinozan system. However, 'unbelieving' is a risky strategy since it relies on you having the cognitive wherewithal to be on your guard. Gilbert et al have shown that cognitive load, pressure and time constraints all undermine our ability to reject false beliefs.

The second potential belief control mechanism is called 'exposure control'. This is a far more draconian approach than 'unbelieving'. False beliefs can be avoided by avoiding beliefs altogether: just as a dieter who loves doughnuts may choose to avoid shops that sell doughnuts, we can try to avoid sources of information that lead us to hold false beliefs. This is a conservative strategy that errs on the side of exclusion: it excludes false beliefs, but it may also exclude some true beliefs. However, unlike the 'unbelieving' strategy, it doesn't suffer from the problems of overload, pressure or time constraints.

All of this suggests that a combination of these strategies is likely to be optimal. When you are really trying to assess the validity of an argument, do your best to avoid distraction. Turn off your screens, put your phone on call forward, and try to cut yourself off from all the sources of noise. Of course, management and colleagues may well think you have taken leave of your senses as you sit there with your screens off, but try to ignore them too. If you are likely to be distracted, then either wait until later, when you can give the assessment the time and effort it requires, or simply follow an exclusion strategy.

Footnotes:
[1] Gilbert, Krull and Malone (1990) 'Unbelieving the unbelievable: Some problems in the rejection of false information', Journal of Personality and Social Psychology, 59; Gilbert (1991) 'How mental systems believe', American Psychologist, 46; Gilbert, Tafarodi and Malone (1993) 'You can't believe everything you read', Journal of Personality and Social Psychology, 65; Gilbert (1993) 'The Assent of Man: mental representation and the control of belief', in Wegner and Pennebaker (eds) The Handbook of Mental Control.

[2] This hints that we support a Spinozan view of the human mind. Descartes was famous for arguing the difference between the brain and the mind; Spinoza, in contrast, saw the two as impossible to separate - they are two sides of the same coin from a Spinozan viewpoint. For more on this see Antonio Damasio's first and third books, Descartes' Error and Looking for Spinoza respectively.

[3] Petty, Cacioppo, Strathman and Priester (1994) 'To think or not to think', in Brock and Shavitt (eds) The Psychology of Persuasion.

[4] Petty, Wells and Brock (1976) 'Distraction can enhance or reduce yielding to propaganda', Journal of Personality and Social Psychology, 34.

I hope you enjoyed the latest research from Montier.

Your trying not to be distracted from the truth analyst,


John F. Mauldin
johnmauldin@investorsinsight.com



To: IQBAL LATIF who wrote (49442) - 11/16/2005 12:57:17 AM
From: IQBAL LATIF
 
Idea Of The Day

When a man gains wealth within,
He shows it with pride without.
When the clouds are full of water,
They move and rumble with thunder.

Saskya Pandita (1182-1251)
Tibetan Grand Lama of Saskya