Politics : Formerly About Advanced Micro Devices


To: koan who wrote (767585), 2/3/2014 1:18:30 AM
From: average joe
 
Dangers of Believing Too Much in Science, Explained by Scientists

By Tara MacIsaac, Epoch Times | January 29, 2014

Last Updated: January 30, 2014 11:31 am

Some of the great minds that shaped the laws of science warn us not to be limited by those laws.

Many of the scientists who helped establish widely accepted scientific theories and laws warned future scientists not to be limited by their work. They also noted that many of the greatest discoveries were ridiculed at first because they stood in opposition to preconceived notions.

1. The Benefits of Being Scoffed At
Rejoice when other scientists do not believe what you know to be true. It will give you extra time to work on it in peace. When they start claiming that they have discovered it before you, look for a new project.

—Efraim Racker, in “Resolution and Reconstitution of Biological Pathways from 1919 to 1984,” published in Federation Proceedings in 1983.

2. Scientists Should Let Go of Preconceived Notions
You are urgently warned against allowing yourself to be influenced in any way by theories or by other preconceived notions in the observation of phenomena, the performance of analyses and other determinations.

—Emil Hermann Fischer, as quoted by M. Bergmann in “Das Buch der großen Chemiker” and translated by Joseph S. Fruton in “Contrasts in Scientific Style: Research Groups in the Chemical and Biomedical Sciences.”

3. Strong Resistance to New Ideas
The mind likes a strange idea as little as the body likes a strange protein, and resists it with a similar energy. It would not perhaps be too fanciful to say that a new idea is the most quickly acting antigen known to science. If we watch ourselves honestly, we shall often find that we have begun to argue against a new idea even before it has been completely stated. I have no doubt that that last sentence has already met with repudiation—and shown how quickly the defense mechanism gets to work.

—Wilfred Trotter, in “The Collected Papers of Wilfred Trotter, F.R.S.,” published in 1941.

4. Just Because It Can’t Easily Be Measured Doesn’t Mean It Doesn’t Exist
The first step is to measure whatever can easily be measured. This is OK as far as it goes. The second step is to disregard that which can’t be easily measured or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can’t be measured easily really isn’t important. This is blindness. The fourth step is to say that what can’t be easily measured really doesn’t exist. This is suicide.

—Charles Handy, economist and organizational behaviorist, in his book “The Empty Raincoat: Making Sense of the Future”

5. Physical ‘Laws’ May Change
We have no right to assume that any physical laws exist, or if they have existed up to now, that they will continue to exist in a similar manner in future. It is perfectly conceivable that one fine day Nature should cause an unexpected event to occur which would baffle us all; and if this were to happen we would be powerless to make any objection, even if the result would be that, in spite of our endeavors, we should fail to introduce order into the resulting confusion. In such an event, the only course open to science would be to declare itself bankrupt.

—Max Planck, in his book “The Universe in the Light of Modern Physics,” translated by W. H. Johnston.

Science cannot solve the ultimate mystery of nature. And that is because, in the last analysis, we ourselves are part of nature and therefore part of the mystery that we are trying to solve.

—Max Planck, in his book “Where Is Science Going?” translated by James Murphy.

6. Science ‘Another Form of Religion’?
We need not wait for science to give us permission to do the uncommon or go beyond what we have been told is possible. If we do, we make science another form of religion. We should be mavericks; we should practice doing the extraordinary.

—Joe Dispenza, in his book “Evolve Your Brain: The Science of Changing Your Mind”

7. What May Now Seem Ridiculous Could Be the Future of Science
I have no doubt that in reality the future will be vastly more surprising than anything I can imagine. Now my own suspicion is that the universe is not only queerer than we suppose, but queerer than we can suppose.

—J.B.S. Haldane, in his book “Possible Worlds and Other Papers”

8. What Is Reason?
Reason, with most people, means their own opinion.

—William Hazlitt, in his essay “The New School of Reform: A Dialogue between a Rationalist and a Sentimentalist.”

9. Most of Science Is ‘Opinion and Illusion’
‘By convention there is color, by convention sweetness, by convention bitterness, but in reality there are atoms and the void,’ announced Democritus. The universe consists only of atoms and the void; all else is opinion and illusion. If the soul exists, it also consists of atoms.

—Edward Robert Harrison, in “Masks of the Universe”

10. Science Embraces Mystery
I love science, and it pains me to think that so many are terrified of the subject or feel that choosing science means you cannot also choose compassion, or the arts, or be awed by nature. Science is not meant to cure us of mystery, but to reinvent and reinvigorate it.

—Robert Sapolsky, in his book “Why Zebras Don’t Get Ulcers”

11. Einstein on ‘Objective Truth’
Physical concepts are free creations of the human mind, and are not, however it may seem, uniquely determined by the external world. In our endeavor to understand reality we are somewhat like a man trying to understand the mechanism of a closed watch. He sees the face and the moving hands, even hears its ticking, but he has no way of opening the case. If he is ingenious he may form some picture of a mechanism which could be responsible for all the things he observes, but he may never be quite sure his picture is the only one which could explain his observations.

He will never be able to compare his picture with the real mechanism and he cannot even imagine the possibility or the meaning of such a comparison. But he certainly believes that, as his knowledge increases, his picture of reality will become simpler and simpler and will explain a wider and wider range of his sensuous impressions. He may also believe in the existence of the ideal limit of knowledge and that it is approached by the human mind. He may call this ideal limit the objective truth.

—Albert Einstein and Leopold Infeld, in their co-written book “The Evolution of Physics.”

12. Scientists Present a Different Face on the World Stage Than Behind the Curtain
Deductivism in mathematical literature and inductivism in scientific papers are simply the postures we choose to be seen in when the curtain goes up and the public sees us. The theatrical illusion is shattered if we ask what goes on behind the scenes. In real life discovery and justification are almost always different processes.

—Peter Brian Medawar, in “Induction and Intuition in Scientific Thought”

13. Science Doesn’t Claim Certainty or Emotional Objectivity
A common fallacy in much of the adverse criticism to which science is subjected today is that it claims certainty, infallibility and complete emotional objectivity. It would be more nearly true to say that it is based upon wonder, adventure and hope.

—Cyril Hinshelwood, as quoted in E. J. Bowen’s obituary of Hinshelwood, published in 1967 in the journal Chemistry in Britain.

14. What Is, Is
A man should look for what is, and not for what he thinks should be.

—Albert Einstein, as quoted in Peter Michelmore’s book “Einstein, Profile of the Man.”

15. Fatal Flaw of Science
Nothing is so fatal to the progress of the human mind as to suppose that our views of science are ultimate; that there are no mysteries in nature; that our triumphs are complete, and that there are no new worlds to conquer.

—Sir Humphry Davy, as quoted by David Knight in the book “Humphry Davy: Science and Power”

Sir Humphry Davy was a chemist and inventor. He discovered sodium, potassium, and calcium using electrolysis, and found that chlorine, previously thought to contain oxygen, is an element. He also invented the Davy lamp, a lamp that is safe to use in coal mines.

16. Soul Will Not Be Explained by Science
I maintain that the human mystery is incredibly demeaned by scientific reductionism, with its claim in promissory materialism to account eventually for all of the spiritual world in terms of patterns of neuronal activity. This belief must be classed as a superstition. […] We have to recognize that we are spiritual beings with souls existing in a spiritual world as well as material beings with bodies and brains existing in a material world.

—Sir John C. Eccles, in his book “Evolution of the Brain: Creation of the Self”

17. Current Understandings May Become Obsolete, as So Many Before Them
Every experiment destroys some of the knowledge of the system which was obtained by previous experiments.

—Werner Heisenberg, in his book “The Physical Principles of the Quantum Theory,” as translated by Carl Eckart and F. C. Hoyt.

theepochtimes.com



To: koan who wrote (767585), 2/3/2014 1:26:55 AM
From: average joe
Scientific Pride and Prejudice

JAN. 31, 2014


Illustration: Olimpia Zagnoli

Gray Matter

By MICHAEL SUK-YOUNG CHWE

SCIENCE is in crisis, just when we need it most. Two years ago, C. Glenn Begley and Lee M. Ellis reported in Nature that they were able to replicate only six out of 53 “landmark” cancer studies. Scientists now worry that many published scientific results are simply not true. The natural sciences often offer themselves as a model to other disciplines. But this time science might look for help to the humanities, and to literary criticism in particular.

A major root of the crisis is selective use of data. Scientists, eager to make striking new claims, focus only on evidence that supports their preconceptions. Psychologists call this “confirmation bias”: We seek out information that confirms what we already believe. “We each begin probably with a little bias,” as Jane Austen writes in “Persuasion,” “and upon that bias build every circumstance in favor of it.”

Despite the popular belief that anything goes in literary criticism, the field has real standards of scholarly validity. In his 1967 book “Validity in Interpretation,” E. D. Hirsch writes that “an interpretive hypothesis” about a poem “is ultimately a probability judgment that is supported by evidence.” This is akin to the statistical approach used in the sciences; Mr. Hirsch was strongly influenced by John Maynard Keynes’s “A Treatise on Probability.”

However, Mr. Hirsch also finds that “every interpreter labors under the handicap of an inevitable circularity: All his internal evidence tends to support his hypothesis because much of it was constituted by his hypothesis.” This is essentially the problem faced by science today. According to Mr. Begley and Mr. Ellis’s report in Nature, some of the nonreproducible “landmark” studies inspired hundreds of new studies that tried to extend the original result without verifying if the original result was true. A claim is not likely to be disproved by an experiment that takes that claim as its starting point. Mr. Hirsch warns about falling “victim to the self-confirmability of interpretations.”

It’s a danger the humanities have long been aware of. In his 1960 book “Truth and Method,” the influential German philosopher Hans-Georg Gadamer argues that an interpreter of a text must first question “the validity — of the fore-meanings dwelling within him.” However, “this kind of sensitivity involves neither ‘neutrality’ with respect to content nor the extinction of one’s self.” Rather, “the important thing is to be aware of one’s own bias.” To deal with the problem of selective use of data, the scientific community must become self-aware and realize that it has a problem. In literary criticism, the question of how one’s arguments are influenced by one’s prejudgments has been a central methodological issue for decades.

Sometimes prejudgments are hard to resist. In December 2010, for example, NASA-funded researchers, perhaps eager to generate public excitement for new forms of life, reported the existence of a bacterium that used arsenic instead of phosphorus in its DNA. Later, this study was found to have major errors. Even if such influences don’t affect one’s research results, we should at least be able to admit that they are possible.

Austen might say that researchers should emulate Mr. Darcy in “Pride and Prejudice,” who submits, “I will venture to say that my investigations and decisions are not usually influenced by my hopes and fears.” At least Mr. Darcy acknowledges the possibility that his personal feelings might influence his investigations.

But it would be wrong to say that the ideal scholar is somehow unbiased or dispassionate. In my freshman physics class at Caltech, David Goodstein, who later became vice provost of the university, showed us Robert Millikan’s lab notebooks for his famed 1909 oil drop experiment with Harvey Fletcher, which first established the electric charge of the electron.

The notebooks showed many fits and starts and many “results” that were obviously wrong, but as they progressed, the results got cleaner, and Millikan could not help but include comments such as “Best yet — Beauty — Publish.” In other words, Millikan excluded the data that seemed erroneous and included data that he liked, embracing his own confirmation bias.
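
Millikan-style cutting is easy to simulate. Below is a minimal Python sketch (an illustration added here, not from the article; the true value, the preconception, and the noise level are all invented): noisy measurements of a true value are averaged twice, once honestly and once after quietly dropping readings that stray too far from what the experimenter expected to see.

    import random

    random.seed(42)

    TRUE_VALUE = 10.0   # the quantity actually being measured
    EXPECTED = 9.0      # the experimenter's preconception
    NOISE = 1.5         # measurement scatter (standard deviation)

    # Simulate noisy measurements of the true value.
    data = [random.gauss(TRUE_VALUE, NOISE) for _ in range(1000)]

    # Honest estimate: average everything.
    honest = sum(data) / len(data)

    # Biased estimate: keep only readings that look "right",
    # i.e. that fall close to what we expected to see.
    kept = [x for x in data if abs(x - EXPECTED) < NOISE]
    biased = sum(kept) / len(kept)

    print(f"true value:      {TRUE_VALUE:.2f}")
    print(f"honest estimate: {honest:.2f}")
    print(f"biased estimate: {biased:.2f} (kept {len(kept)} of {len(data)})")

The honest average lands near the true value; the censored one drifts toward the preconception, even though every point kept is a perfectly real measurement.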

Mr. Goodstein’s point was that the textbook “scientific method” of dispassionately testing a hypothesis is not how science really works. We often have a clear idea of what we want the results to be before we run an experiment. We freshman physics students found this a bit hard to take. What Mr. Goodstein was trying to teach us was that science as a lived, human process is different from our preconception of it. He was trying to give us a glimpse of self-understanding, a moment of self-doubt.

When I began to read the novels of Jane Austen, I became convinced that Austen, by placing sophisticated characters in challenging, complex situations, was trying to explicitly analyze how people acted strategically. There was no fancy name for this kind of analysis in Austen’s time, but today we call it game theory. I believe that Austen anticipated the main ideas of game theory by more than a century.

As a game theorist myself, how do I know I am not imposing my own way of thinking on Austen? I present lots of evidence to back up my claim, but I cannot deny my own preconceptions and training. As Mr. Gadamer writes, a researcher “cannot separate in advance the productive prejudices that enable understanding from the prejudices that hinder it.” We all bring different preconceptions to our inquiries, whether about Austen or the electron, and these preconceptions can spur as well as blind us.

Perhaps because of its self-awareness about what Austen would call the “whims and caprices” of human reasoning, the field of psychology has been most aggressive in dealing with doubts about the validity of its research. In an open email in September 2012 to fellow psychologists, the Nobel laureate Daniel Kahneman suggests that “to deal effectively with the doubts you should acknowledge their existence and confront them straight on, because a posture of defiant denial is self-defeating.” Everyone, including natural scientists, social scientists and humanists, could use a little more self-awareness. Understanding science as fundamentally a human process might be necessary to save science itself.

Michael Suk-Young Chwe, a political scientist at U.C.L.A., is the author of “Jane Austen, Game Theorist.”

nytimes.com