Lachlan passed away in January 2010.  As a memorial, this site remains as he left it.
Therefore the information on this site may not be current or accurate and should not be relied upon.
For more information follow this link

Welcome to Lachlan Cranswick's Personal Homepage in Melbourne, Australia

Karl Popper Links and Resources

Lachlan's Homepage is at

[Back to Lachlan's Homepage] | [What's New on Lachlan's Homepage] | [Literature and Books page]

Karl Popper

  • Top-notch, warts-and-all obituary of Karl Popper: THE BRITISH ACADEMY: Obituary of Karl Raimund Popper, 1902-1994, by John Watkins, London School of Economics (published December 1997 in Proceedings of the British Academy, Volume 94, pp. 645-684):

A few reasons why Karl Popper is obscure and unknown, while the totalitarian philosophers he debunked (Karl Marx, Hegel, Plato, Aristotle, etc.) are still widely known, with easily available literature

(An official Lachlan rant)

1. Karl Popper's books are hard to find - even if you know to look out for them. Most bookshops do not stock any literature by Karl Popper.

  • NB: While the Columbia University bookshop in Manhattan, New York City, USA has a good range of Karl Popper books at reasonable prices (May 2001), experience suggests this is the exception. (Even Blackwells in Oxford lacked "The Open Society and its Enemies" when I last popped by for a look - circa mid 2000).

    Monash University Bookshop in Melbourne, Australia is a complete loss. (Jan to March 2001 - and visits on prior trips back to Melbourne)

2. What Karl Popper books can be found are often obscenely expensive - priced as if "Karl Popper books were printed to be ignored and left unread".

  • Example from Melbourne Australia:
    From: Rex 
    To: "'Lachlan Cranswick'" []
    Subject: Further this and that
    Date: Tue, 12 Jun 2001 11:10:48 +1000
    Hi Lachlan,
    Borders has just opened a new bookshop in Chadstone, and I was surprised to
    see that they had 8 or 9 books by Popper, including all the better known
    ones plus one on quantum mechanics and philosophy, one on the brain with
    Eccles and others I had not seen before. I was put off buying them because
    of the prices - A$60 -70, which seems excessive for non-recent paperbacks.
    Apart from Popper the shop is not very good, the section on science being
    particularly poor compared with the Chapel St. Borders.

  • From: Rex 
    To: "'Lachlan Cranswick'" []
    Subject: RE: Require clarificiation!
    Date: Tue, 12 Jun 2001 12:01:27 +1000
    >Could you clarify that it is "AU $60 to AU $70" to buy standard Karl
    >Popper books such as the "open society"?
    Hi Lachlan,
    I think that all the books lay between A$55 and A$71, most about $60. The
    price did not relate to the thickness of the books.

3. Karl Popper's works are still in copyright - while the totalitarian philosophers he debunked have their writings freely available on the internet, and cheap editions of their works in bookshops


4. Many websites describing the work of Karl Popper can be quite obscurantist

  • Angebote zu Sir Karl Popper im Internet:

  • Karl Popper Web:

  • Science as Falsification, by Karl R. Popper (an excerpt originally published in Conjectures and Refutations (1963)):

  • Critical Rationalism Forum (lots of good info and links on Karl Popper) :

  • Introduction to Karl Popper:

  • HOOVER DIGEST 2000 No. 1: Piers Norris Turner: Remembering Karl Popper
    • At


      In 1989, the fall of the Berlin Wall brought renewed interest in Popper's major contribution to political philosophy, The Open Society and Its Enemies. When The Open Society appeared in England in 1945, Popper was an obscure Austrian philosopher of science living in New Zealand. The book had been rejected by some twenty publishers before Friedrich von Hayek encouraged Routledge to publish it. Popper called the book his "war effort," an attempt to criticize the ideas underlying the twin ideological horrors of fascism and communism. He was concerned that well-meaning people could be induced into believing what he saw as dangerously erroneous doctrines. Although compelled to leave his native Austria in the 1930s (because of his Jewish ancestry), his book is remarkably free from personal bitterness or sadness. It is not a memoir but a philosophical broadside against utopian thinking. Popper challenged the dangerous ideas that, at the time he wrote, seemed poised to engulf the world. He did not shrink from tracing the sources of those dangerous ideas to Marx, to Hegel, and even to that greatest of all philosophers, Plato. At a time when many intellectuals had lost faith in democracy, Popper offered a spirited defense of democratic principles and outlined a compelling vision of a society grounded in democratic reforms.

      Popper was a fallibilist, one who perceives great error and danger in any theory of knowledge - or regime - that claims to offer certain truth. In such a system, there would be no incentive to establish social and political structures that promote learning or the free exchange of ideas; truth is already at hand. In the name of historical progress, the regime may then justify the squelching of human freedoms and even atrocities on a grand scale. Consequently, Popper fought against those who claimed to know the historical laws of change, a false doctrine Popper called historicism. Historicist prophecies were a threat to the open society, and, indeed, Nazism and Soviet-style totalitarianism alike produced unimaginable horrors.

      Despite its success in articulating the inherent threat of Marxism, Popper's book is not about Soviet Russia or conceived of as a Cold War tome. In fact, Popper developed his ideas just before World War II, in a radically different geopolitical landscape. Yet, soon after it appeared, The Open Society was denounced by philosophy professors for its irreverent exposition of the authoritarian tendencies in Plato and Marx. Other intellectuals were dismissive, not surprisingly since many were for too long blind to the failures of Soviet communism and indignant at any comparison of Marxism with fascism. Nevertheless, Popper's Open Society has always had a wide readership and influential champions on both the left and the right. Isaiah Berlin wrote in 1963 that Popper's Open Society contained "the most scrupulous and formidable criticism of the philosophical and historical doctrines of Marxism by any living writer." National Review recently ranked the book number six on its list of the hundred most important nonfiction works of the century. George Soros, who first encountered The Open Society as Popper's student at the London School of Economics, founded the Open Society Institute to propagate Popper's ideas, particularly in Eastern Europe. Thus the political philosophy Popper first articulated before the start of the Cold War is now being studied and put into practice in countries newly emerging from behind the Iron Curtain.

  • The Karl Popper Institute (1997 -)
    • At

    • "Founded at the explicit wish of Sir Karl's legal heirs and executors Ms. Melitta and Mr. Raymond Mew to establish a research unit in his native city. The Institute's policy is one of promotion, critical analysis and further development of Sir Karl's philosophy and world views. Far reaching goal is to make general public and foremost youth familiar with his scientific and human ideals of tolerance, modesty and intellectual sincerity"

  • Scanned letter from Karl Popper Web (about Leonard Nelson):

  • Karl Popper correspondence with Doctor Thomas Szasz:
    • An extract:
      (2)  Role Playing and Game playing.  I shall make
      only quite dogmatic remarks.  Role playing is for
      those who do not dare to be what they are.  It is
      itself already a shoddy and dangerous substitute for
      genuine learning, that is, for genuinely changing
      oneself to become more nearly what one wants to be.
      This learning new roles is not the kind of learning
      which is really desirable, and an end in itself.
           Learning a new role has only an instrumental
      value - for survival.  But none of us survives long;
      and instrumental values are not enough.  Learning - as
      opposed to learning a new role - and growing up, until
      we die, is, or can be, a value in itself.  To perform
      constantly the miracle of lifting oneself out of the
      swamp by one's own shoelaces is, indeed, a purpose.

  • The Thomas S. Szasz Cybercenter for Liberty and Responsibility (author of "The Myth of Mental Illness"):

  • Friedrich A. Hayek (1899-1992) - with some gossip about Karl Popper ("Hayek's personal and professional relationship with Popper, whom he helped in his career, was somewhat ironic considering that Hayek was a cousin of the philosopher Ludwig Wittgenstein (1889-1951), with whom Popper disagreed on almost every conceivable philosophical issue. Popper's critique of Wittgenstein's Tractatus, in the footnotes to The Open Society and Its Enemies, was devastating; and after World War II there was a famous incident when Wittgenstein furiously stalked out of a talk by Popper. Popper was told that he had been the first person to interrupt Wittgenstein the way Wittgenstein interrupted everyone else"):

  • Popper's "falsificationist philosophy of science" : Obituary of Karl Popper, 1902 - 1994, by John Watkins:

  • "Popper sought to solve his two basic problems at one blow with his falsificationist philosophy of science. What demarcates science from non-science (metaphysics, logic, mathematics, and pseudo-science) is not the verifiability but the falsifiability of its theories. The method of science also is not inductive; it does not start out from observations and generalise from them: it starts out from problems, which it attacks with bold conjectures. The latter are unverifiable and unjustifiable but, when well developed, have predictive implications which can be put to the test, the more severe the better. A test will be severe if made with sufficiently discriminating experimental apparatus on predictions which deviate (as Einstein's did from Newton's) to a small but detectable extent from unrefuted predictions of the previous theory, or if made on predictions which break new ground. On this view, scientific inferences are all deductive, either from conjectural premises to a falsifiable consequence or from a falsified consequence to the negation of the conjunction of the jointly responsible premises. The problem of induction therefore drops out. (Whether it drops out completely is a question which will come up again.)"

  • New Zealand: Obituary of Karl Popper, 1902 - 1994, by John Watkins:

  • "In his first year there he gave at a seminar what became a stunning little piece called 'What Is Dialectic?' (he was not yet banning what-is questions). Against dialecticians who say that contradictions are welcome because they are fertile, it declared that they are fertile only so long as we strive to eliminate them."

  • "Gombrich and Hayek tried to find a publisher, but without success at first. Cambridge University Press turned it down, and there were several more rebuffs. As well as Plato-reverence, there was, in those days of paper rationing, dismay at the book's size; and some admirers of the book, such as Robbins, felt that the critique of Marx was too long and heavy. Hayek eventually turned to Routledge, who were publishing his own The Road to Serfdom. There Herbert Read read it and was enormously impressed. It was at last accepted."

  • LSE: Wittgenstein storms out: Obituary of Karl Popper, 1902 - 1994, by John Watkins:

  • "I will reconstruct what happened from the various sources as best I can. The meeting was in Braithwaite's room in King's College. Wittgenstein, who chaired the meeting, sat on one side of an open fire and Popper on the other. Russell was in a high-backed rocking-chair. Others present included Elizabeth Anscombe, Richard Braithwaite, C. D. Broad, A. C. Ewing, Peter Geach, Norman Malcolm, Margaret Masterman, Stephen Toulmin, and John Wisdom (A. J. T. D., not J. O.). There were also various students. The secretary's invitation to Popper had said that 'short papers, or a few opening remarks stating some philosophical puzzle, tend as a rule to produce better discussions than long and elaborate papers'. The minutes say that Popper began by expressing astonishment at the Secretary's letter of invitation (a footnote explains that this is the Club's form of invitation). Wittgenstein seems to have mistaken Popper's opening remarks for a complaint against the Secretary, and sprang to his defence. But Popper was taking the wording of the invitation as expressing the Wittgensteinian thesis that there are no genuine philosophical problems, only puzzles; and he set out to counter this thesis by bringing forward some real problems. One concerned induction. Wittgenstein dismissed this as a merely logical problem. Another concerned the question of actual (as distinct from merely potential) infinities. (One of the two theses in Kant's first antinomy says that the world must have had a beginning in time, otherwise an actual or completed infinity of time will have elapsed. Popper rebutted this many years later.) Wittgenstein dismissed this as a mathematical problem. As his last example, Popper gave the question of the validity of moral rules. Wittgenstein, who had hold of the poker and was waving it about a good deal, demanded an example of a moral rule, to which Popper replied: 'Not to threaten visiting lecturers with pokers'. 
There was laughter, and Wittgenstein stormed out, angrily declaring as he went that Popper was confusing the issues; whereupon Russell called out, 'Wittgenstein, you're the one who's causing the confusion'."

  • LSE: Popper Lecturing in the USA: Obituary of Karl Popper, 1902 - 1994, by John Watkins:

  • "This was an exhilarating time for him. He had now accepted an invitation from Harvard University to give the William James Lectures in 1950 (John Dewey and Bertrand Russell were among previous lecturers). He was paid 'Hollywood rates', as he put it (ten lectures at $600 per lecture). His title was, 'The Study of Nature and of Society'. It was his first visit to America. He found it 'a marvellous country' and was full of enthusiasm for things American. For instance, he said that while Negroes were admittedly a depressed class, they were the least depressed depressed class in the world. He found some of the work of Harvard graduate students 'really outstanding' (perhaps he had Tom Kuhn in mind). He visited other Ivy League universities. He gave the Woodward Lecture at Yale (I have described my experience of this occasion elsewhere), and at Princeton he gave a seminar talk on indeterminism with both Einstein and Bohr in the audience! It seems that Bohr rather took over the discussion, and then went on and on; six hours after the meeting started the room contained just him at the blackboard with Einstein and Popper as his audience. 'He's mad', Einstein whispered. In Unended Quest Popper described how he met Einstein three times, at the latter's request, and tried to argue him out of determinism. A good many years later, when Bartley was editing the Postscript, Popper drew his attention to evidence suggesting that his (Popper's) arguments may have had some influence on Einstein."

  • Criticisms of the Marxist approach:
  • "The criticisms referred to above are primarily criticisms of the application of the Marxist approach, rather than of Marxism per se. Naturally, Marxism, like any other political philosophy, has had its opponents ever since Marx first formulated it. Foremost amongst such critics is the philosopher Sir Karl Popper and any reader interested in an attack on the very fundamentals of Marxism is referred to his Poverty of Historicism and The Open Society and its Enemies. Critics of Marxism have traditionally had little impact on Marxists, since, as Peter Medawar points out:
    ... just as any criticism of psychoanalysis is construed as an infirmity of the psyche which itself requires psychoanalytic treatment, so criticism of an essentially Marxist theory is thought to reveal its author as yet another victim and dupe of the very socio-economic forces he has presumed to question.
    - Medawar (1996)"

  • "The Capitalist Threat" by George Soros: "What kind of society do we want? "Let the free market decide!" is the often-heard response. That response, a prominent capitalist argues, undermines the very values on which open and democratic societies depend" :
  • "In The Philosophy of History, Hegel discerned a disturbing historical pattern -- the crack and fall of civilizations owing to a morbid intensification of their own first principles. "
  • "To fulfill this role, the concept of the open society needs to be redefined. Instead of there being a dichotomy between open and closed, I see the open society as occupying a middle ground, where the rights of the individual are safeguarded but where there are some shared values that hold society together. This middle ground is threatened from all sides. At one extreme, communist and nationalist doctrines would lead to state domination. At the other extreme, laissez-faire capitalism would lead to great instability and eventual breakdown. There are other variants. Lee Kuan Yew, of Singapore, proposes a so-called Asian model that combines a market economy with a repressive state. In many parts of the world control of the state is so closely associated with the creation of private wealth that one might speak of robber capitalism, or the "gangster state," as a new threat to the open society."

  • The Free Cuba Foundation is a non-profit, non-partisan organization whose purpose is to work towards the establishment of an independent and democratic Cuba using non-violent means:

  • The case for the "open society" and the role the WTO plays - WTO NEWS: SPEECHES - DG MIKE MOORE (Australia-Israel Chamber of Commerce):

  • Chronicles of Love and Resentment - Eric Gans - The Post-Millennial Age:

  • Geoffrey Sampson Homepage:

  • Educating Eve -

  • Scientific Censorship and Evolution :

    "In his "The Open Society and its Enemies" Sir Karl Popper says that the great value of the scientific method is that it saves us from "The tyranny of opinion". If neo-Darwinists can counter the evidence I present, let them do so. If they seek to prevent my writing being published because they don't like it, then it is not just I that fall victim to the "tyranny of opinion", it is all of us. "

  • Harzem Presents Papers on Rise of Hitler and Stalin and on Open Society and its Enemies

    • KARL POPPER 2002 CENTENARY CONGRESS - Vienna, 3 July to 7 July 2002 - Venue: Rathaus & University of Vienna

      • The beauty of a world without privacy (Noted science-fiction author David Brin -- winner of both the Hugo and Nebula Awards for outstanding speculative fiction -- says that privacy could be the ultimate danger to our society. ) :
        "ZDNN: How did you form this idea of transparency?
        Brin: It is an outgrowth of Karl Popper's "The Open Society and its Enemies" that helped us win the Cold War and retain our freedom back around 1948. He said many of the same things, except in a more complicated way -- it was less accessible. "

      • "The Transparent Society: Will Technology Force us to Choose Between Privacy and Freedom?" by David Brin, Ph.D.:
        "The late Karl Popper pointed out the importance of this mythology in the dark days during and after the Second World War, in The Open Society and its Enemies. Only by insisting on accountability, he concluded, can we constantly remind public servants that they are servants. It is also how we maintain some confidence that merchants aren't cheating us, or that factories aren't poisoning the water. As inefficient and irascibly noisy as it seems at times, this habit of questioning authority ensures freedom far better than any of the older social systems that were based on reverence or trust. "

    • Robert A.J. Matthews: Facts versus Factions: the use and abuse of subjectivity in scientific research (Bayesian inference / Bayesian statistics - also "Baysian", "Basian", deliberately catching the mis-spellings)
      • At

      • "Summary:

        This paper explores the use and abuse of subjectivity in science, and the ways in which the scientific community has attempted to explain away its curiously persistent presence in the research process. This disingenuousness is shown to be not only unconvincing but also unnecessary, as the axioms of probability reveal subjectivity to be a mathematically ineluctable feature of the quest for knowledge. As such, concealing or explaining away its presence in research makes no more sense than concealing or explaining away uncertainty in quantum theory. The need to acknowledge the ineluctability of subjectivity transcends issues of intellectual honesty, however. It has profound implications for the assessment of new scientific claims, requiring that their inherent plausibility be taken explicitly into account. Yet as I show, the statistical methods currently used throughout the scientific community lack this crucial feature. As such, they grossly exaggerate both the size of implausible effects and their statistical significance, and lend misleading support to entirely spurious "discoveries". These fundamental flaws in conventional statistical methods have long been recognised within the statistics community, but repeated warnings about their implications have had little impact on the practices of working scientists. The result has been an ever-growing number of spurious claims in fields ranging from the paranormal to cancer epidemiology, and continuing disappointment as supposed "breakthroughs" fail to live up to expectations. The failure of the scientific community to take decisive action over the flaws in standard statistical methods, and the resulting waste of resources spent on futile attempts to replicate claims based on them, constitutes a major scientific scandal. "

      • Robert Millikan is widely regarded as one of the founders of modern American science, his determination of the charge on the electron winning him the 1923 Nobel Prize for physics. In a now-famous study, the physicist and historian Gerald Holton examined the log-books for Millikan's experiments with the electron, and revealed that he repeatedly rejected data that he deemed "unacceptable" (Holton 1978). The criteria he used were blatantly subjective, as revealed by the comments in the log-books, such as "Very low - something wrong" and "This is almost exactly right". Throughout, Millikan appears to have been driven partly by a desire to get results that were self-consistent, broadly in agreement with other methods, and consistent with his personal view that the electron is the fundamental and indivisible unit of electric charge.

        While these criteria may seem reasonable enough, they carry inherent dangers. Even today a fundamental explanation of the precise numerical value of the charge on the electron remains lacking, so Millikan was hardly in a position to decide objectively which values were high and which ones low. Previous results may have been fundamentally flawed, while the demand for self-consistent results may mask the existence of subtle but genuine properties of the electron. Millikan could also have been proved wrong in his belief that the electron was fundamental.

        However, it is also clear that Millikan had another powerful motivation for using all means to obtain a convincing determination of the electronic charge: he was in a race against another researcher, Felix Ehrenhaft at the University of Vienna. Ehrenhaft had obtained similar results to those of Millikan, but they were interspersed with much lower values that suggested that the electron was not, in fact, the fundamental unit of charge. Millikan had no such doubts, published his results, and went on to win the Nobel Prize.

  • Apologists for Millikan's hand-picking of data also point out that the numerical result he obtained, -1.592 x 10^-19 coulombs, is just 0.6 per cent below the modern value of -1.6021892 x 10^-19 C (Weinberg 1993 p 99). At first sight, this does indeed seem impressive. However, Millikan's stated result was based on a faulty value for the viscosity of air, which when corrected changes Millikan's result to -1.616 x 10^-19 C, increasing the discrepancy with the modern value by over 40 per cent. More importantly, however, it puts the latter well outside the error-bounds of Millikan's central estimate. Indeed, the discrepancy is so large that the probability of generating it by chance alone is less than 1 in 10^3. Millikan's "remarkable ability" to scent out the correct answer was clearly not as great as his apologists would have us believe. Rather more remarkable is Millikan's ability, almost half a century after his death, to evade recognition as an insouciant scientific fraudster who won the Nobel Prize by deception.

        The dangers of the injudicious use of subjective criteria are further highlighted by the aftermath of Millikan's experiments. In the decades following his work and Nobel Prize, other investigators made determinations of the electronic charge. The values they obtained show a curious trend, creeping further and further away from Millikan's "canonical" value, until finally settling down at the modern figure with which, as we have seen, it is wholly incompatible. Why was this figure not reached sooner? The Nobel Prizewinning physicist Richard Feynman has given the answer in his own inimitable style (Feynman 1988, p 382):

        "It's apparent that people did things like this: when they got a number that was too high above Millikan's, they thought something was wrong - and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan's value they didn't look so hard. And so they eliminated the numbers that were too far off."
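The percentages quoted for Millikan's result can be checked directly. A quick sketch, using only the numbers given in the passage; reproducing the "over 40 per cent" figure by comparing against the rounded 0.6 per cent is my assumption about how it was obtained:

```python
# Checking the Millikan figures quoted above. All input values come from the
# passage; the comparison against the rounded 0.6% figure is an assumption.
millikan_stated = 1.592e-19      # C, Millikan's published value
millikan_fixed  = 1.616e-19      # C, after correcting the air-viscosity error
modern          = 1.6021892e-19  # C, the modern value

below = (modern - millikan_stated) / modern * 100
above = (millikan_fixed - modern) / modern * 100
print(f"stated value: {below:.2f}% below the modern value")     # ~0.64%
print(f"corrected value: {above:.2f}% above the modern value")  # ~0.86%
print(f"discrepancy grows by roughly {(above / 0.6 - 1) * 100:.0f}%")
```

So the correction moves Millikan's figure from about 0.6 per cent below to about 0.86 per cent above the modern value, consistent with the "over 40 per cent" growth claimed.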

        • Semmelweis's long and unsuccessful struggle during the 1840s to introduce antiseptic practices into hospitals (Asimov 1975 p 348). Despite the dramatic fall in the numbers of cases of childbed fever produced by the use of antiseptics, the practice was rejected because of resentment by the doctors that they could be causing so many deaths, nationalistic prejudice against a Hungarian working in a Viennese hospital, and annoyance at the way the antiseptics eliminated the "professional odour" on their hands after returning to the wards from working in the mortuary.
        • The rejection and ridiculing of Francis Peyton Rous's evidence for the existence of viruses capable of transmitting cancer (Williams 1994, p422). First put forward in 1911, Rous's evidence came at a time when the existence of viruses was still controversial - they were beyond the reach of contemporary microscopy - and when cancer was thought to be caused by "tissue irritation". Rous's claim was finally vindicated 25 years later. In 1966 he was awarded the Nobel Prize - at the age of 87.
        • The vociferous response of geologists to the proposal by Alfred Wegener, a German astronomer and meteorologist, that the continents moved across the face of the Earth. Though Wegener had found considerable evidence for the phenomenon, he was unable to propose a physical mechanism for it, and his proposal was dismissed as a "fairy tale", the product of "auto-intoxication in which the subjective idea comes to be considered as an objective fact" (Hellman 1998 p150). His claims were subsequently vindicated in the 1960s, 50 years after he first proposed them, and 30 years after his death.
        • In the early 1980s, the Australian physician Barry Marshall encountered derision and hostility for his claim that a previously unknown bacterium, Helicobacter pylori, was responsible for stomach ulcers. Marshall's evidence went against the prevailing view that bacteria were incapable of thriving within the acidic conditions of the stomach. H. pylori is now accepted as the principal cause of stomach ulcers, and has also been implicated in gastric cancer.

    • Subjectivity in the testing of theories

      The value of any scientific theory, no matter how theoretically elegant or plausible, is ultimately tested by experiment. Conventionally, this crucial element of the scientific process involves extracting a clear and unequivocal prediction from the theory, investigating this prediction experimentally, and assessing the outcome objectively. Exactly how this comparison is performed, and what conclusions are drawn, has long been a subject of debate among scientists and philosophers. Many scientists consider themselves to be followers of Karl Popper and the concept of falsifiability (Popper 1963): that to be considered scientific, a theory must be capable of being proved wrong. On this view, the experiment and the analysis of data should be performed to discover if the theory is falsified, and if it is, it must be abandoned. As such, theories are never proved "correct": they merely survive until the next experimental attempt at falsification. There are a great many fundamental problems with Popper's widely-held - and admittedly appealing - view of the scientific process (see especially Howson & Urbach 1993). Put simply, these problems boil down to the fact that the concept of falsification is supported neither in principle nor in practice. Over 90 years ago the French physicist and philosopher Pierre Duhem pointed out that the testable consequences of scientific theories are not a pure reflection of the theory itself, but are based on many extra assumptions. As a result, if an experiment appears to falsify a theory, this does not automatically imply that the theory must be false: it is always possible to blame one of the auxiliary assumptions.

    • This "curious" fact, combined with the many problems and pitfalls associated with frequentist measures of "significance", raises an obvious question: is there a better way? As I now show, the answer is yes.

    • Facts versus Factions: the use and abuse of subjectivity in scientific research - PART 2 :

    • The classical frequentist techniques of inference are not, in fact, "classical" at all, but relative newcomers in the long history of statistical inference. Before the 1920s, another approach to statistical inference was in general use, based on a result that flows directly from the axioms of probability. As such, this approach has solid theoretical foundations, produces intuitive, readily-understood measures of "significance", and remains as valid today as it did before it was eclipsed by the flawed attempts of Fisher et al. to create an objective theory of statistical inference. It is known as Bayesian inference, after the 18th Century English cleric Thomas Bayes who first published the key theorem behind it: Bayes's theorem.

      The power and importance of this theorem is immediately apparent in its solution to one of the central problems of standard statistical inference. As we have seen, frequentist methods do not tell us Prob(theory | data); that is, they do not tell us what our belief in a theory should be, given the data we actually saw. To answer that question, we must turn to the axioms of probability theory, from which we find that (see, e.g. Feller 1968 Ch 5):
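      The relation referred to at this point is omitted on this page; presumably it is Bayes's theorem, which in the notation of the surrounding text reads:

```latex
\mathrm{Prob}(\mathrm{theory} \mid \mathrm{data}) =
  \frac{\mathrm{Prob}(\mathrm{data} \mid \mathrm{theory}) \,
        \mathrm{Prob}(\mathrm{theory})}{\mathrm{Prob}(\mathrm{data})}
```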

    • In short, Bayesian inference provides a coherent, comprehensive and strikingly intuitive alternative to the flawed frequentist methods of statistical inference. It leads to results that are more easily interpreted, more useful, and which more accurately reflect the way science actually proceeds. It is, moreover, unique in its ability to deal explicitly and reliably with the provably ineluctable presence of subjectivity in science.

      • For over 250 years, Princeton students have attended Commencement on a Tuesday in late May or early June, an outdoor event for which good weather is vital. According to local folklore, good weather does usually prevail, prompting claims that those attending may "wish" good weather into existence. By analysing local weather records spanning many decades, Nelson found that Princeton's weather was generally no different from that of its surroundings. However, he did find some evidence that the town was less likely to be rained on during the outdoor events. The phenomenon gave z-scores as high as 1.996, which on a frequentist basis gives a "significant" P-value of 0.046. Properly mindful of the implausibility of the phenomenon, however, Nelson was reluctant to take this "objective" finding at face value, and instead reached a more subjective conclusion: "These intriguing results certainly aren't strong enough to compel belief, but the case presents a very challenging possibility".

        A Bayesian analysis allows a far more concrete assessment of plausibility to be made. Clearly, with such a bizarre claim, there is little one can say about the precise value of a sensible prior probability for the null hypothesis of no real effect, other than to say that the probability is likely to be pretty high. In such cases, Bayesian inference still gives valuable insight, as it allows one to estimate the level of prior probability necessary to sustain a belief that the effect is illusory, even in the light of Nelson's data. Using (4) and (3) and z = 1.996, this inverse Bayesian inference shows that Prob(Null | data) > 0.5 for all Pr(Null) > 0.88. In other words, for anyone whose prior scepticism about the effectiveness of "wishful thinking" exceeds 90 per cent, the balance of probabilities is that the effect is illusory, despite Nelson's data.
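        The inverse calculation can be sketched numerically. The sketch below is my own illustration, not code from the article; it assumes the standard minimum Bayes factor bound BF >= exp(-z^2/2) for a z-score, which reproduces the 0.88 threshold quoted above.

```python
import math

def min_bayes_factor(z):
    # Standard lower bound on the Bayes factor for a z-score:
    # BF >= exp(-z^2 / 2), the strongest evidence against the null
    # that the data can supply (assumed form, not from this page).
    return math.exp(-z * z / 2.0)

def sceptical_prior_threshold(z):
    # Smallest prior Prob(Null) at which the posterior still favours
    # the null, i.e. Prob(Null | data) > 0.5.
    # Posterior odds = prior odds * BF, so we need prior odds > 1 / BF.
    prior_odds = 1.0 / min_bayes_factor(z)
    return prior_odds / (1.0 + prior_odds)

print(round(sceptical_prior_threshold(1.996), 2))  # -> 0.88
```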

        As this example shows, frequentist methods greatly exaggerate the "significance" of intrinsically implausible data. However, as we shall now see, frequentist methods can also seriously exaggerate both the size and significance of effects in much more important mainstream areas of research, such as clinical trials.

      • Misleading "significance" of clinical trial results: misleading P-values

        The most common methods for investigating the efficacy of a new drug or therapy, or the impact of exposure to some risk-factor, are the so-called randomised clinical trials and case-control studies, in which a group of people given the new treatment or known to have the disease are compared with a "control" group. One common frequentist method of analysing the outcome is to reduce the results to a test-statistic (such as χ²), which is then turned into a P-value; as before, if this is less than 0.05, then the difference between the two groups is deemed to be significant. Again, however, a Bayesian analysis reveals that the real "significance" of such a finding is typically much less impressive than the P-values imply.

        As before, I shall demonstrate this by taking a real-life case. During the early 1990s, research emerged to suggest that the risk of coronary heart disease (CHD) is associated with childhood poverty (Elford et al. 1991). Following the discovery that infection with the bacterium H. pylori is also linked to poverty, some researchers suspected that the bacterium may form the "missing link" between the two. Precisely how a bacterium in the stomach might cause heart disease is less than clear - raising the key issue of plausibility, to which we shall return shortly. Nevertheless, a number of studies were undertaken to investigate the link between CHD and H. pylori. In one of the first such studies (Mendall et al. 1994), 60 per cent of patients who suffered CHD were found to be infected with H. pylori, compared with 39 per cent of normal controls. When the effects of age, CHD risk factors and current social class had been controlled for, the results led to a χ² value of 4.73. Using frequentist methods, this leads to a P-value of 0.03, implying that the rate of CHD among those infected with H. pylori is "significantly" higher than among those without.

        On the face of it, this finding raises the intriguing prospect of being able to tackle one of the major killers of the western world using nothing more than antibiotics. Yet while the evidence that both CHD and H. pylori infection are more common among the poor is suggestive of a link between the two, it is hardly unequivocal. Such scepticism is underscored by the lack of any convincing mechanism by which a gastric bacterium could trigger heart disease. The frequentist P-value, however, cannot reflect any of these justifiable qualms; sceptics of the link have no option but to say that on this occasion they are just going to ignore the supposed "significance" of Mendall et al.'s finding.

        In contrast, Bayesian inference requires no such arbitrary "moving of the goalposts": it allows explicit account to be taken of the plausibility of the findings. In the case of the supposed link between CHD and H. pylori, the lack of any convincing mechanism balanced against the socio-economic evidence of a link suggests that an agnostic prior probability of Prob(Null) = 0.5 would be a reasonable starting-point for assessing results like those found by Mendall et al.

        [TEXT DELETED]

        Inserting the value of χ² = 4.73 found by Mendall et al. into (6) shows that the BF is at least 0.337. Putting this in (6) we find that Prob(Null | data), the probability that Mendall et al.'s results are due to nothing more than chance, is at least 0.25. In other words, even using an agnostic prior, the frequentist P-value has over-estimated the real "significance" of the findings by almost an order of magnitude.

        Those taking a more sceptical view of a link between a gastric bacterium and CHD would, of course, set Prob(Null) somewhat higher. Applying the concept of inverse Bayesian inference used earlier, it emerges that even a relatively modest sceptical prior of just Prob(Null) = 0.75 is enough to lead to a balance of probabilities that Mendall et al.'s findings are entirely illusory.
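        The equations referred to as (5) and (6) were deleted from this page, but the quoted figures are reproduced by a Matthews-style lower bound on the Bayes factor for a χ² statistic with one degree of freedom, BF >= sqrt(χ²)·exp((1-χ²)/2). The sketch below is my own reconstruction under that assumption:

```python
import math

def min_bayes_factor_chi2(chi2):
    # Assumed form of the deleted equations: lower bound on the Bayes
    # factor for a chi-squared statistic with one degree of freedom,
    # BF >= sqrt(chi2) * exp((1 - chi2) / 2).
    return math.sqrt(chi2) * math.exp((1.0 - chi2) / 2.0)

def prob_null_given_data(chi2, prior_null):
    # Posterior Prob(Null | data) via posterior odds = prior odds * BF.
    bf = min_bayes_factor_chi2(chi2)
    post_odds = (prior_null / (1.0 - prior_null)) * bf
    return post_odds / (1.0 + post_odds)

print(round(min_bayes_factor_chi2(4.73), 3))      # -> 0.337
print(round(prob_null_given_data(4.73, 0.5), 2))  # -> 0.25  (agnostic prior)
print(prob_null_given_data(4.73, 0.75) > 0.5)     # -> True  (sceptical prior)
```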

    • Reputable researchers would no doubt feel more confident defending evidence for an anomalous phenomenon by applying at least a mild level of scepticism in their assessment of significance. In this case, a P-value of no more than around 2×10⁻⁴ is appropriate, a value 250 times more demanding than the conventional 0.05 criterion. These technical results can be stated much more succinctly, however: extraordinary claims require extraordinary evidence. This is a well-attested and widely-accepted principle, yet it is noticeable by its absence in the mathematics of frequentist inference.

    • There is a dangerous irony in the continuing reluctance of the scientific community to adopt Bayesian inference. For this reluctance stems largely from a deep-rooted fear that adopting methods that embrace subjectivity is tantamount to conceding that the scientific enterprise really is a social construct, as claimed by the post-modern advocates of the "anti-science" movement. The central lesson of Bayes's theorem is, however, quite the opposite. It shows, with full mathematical rigour, that while evidence for a specific theory may indeed start out vague and subjective, the accumulation of data progressively drives the evidence towards a single, objective reality about which all can agree.

      It is ironic indeed that by failing to recognise this, the scientific community continues to use techniques of inference whose unreliability undermines confidence in the scientific process, and which thus threatens to deliver science into the hands of its enemies.

    • Why are coincidences so impressive? (Perceptual and Motor Skills 80 1121-1122 1995) by Robert A.J. Matthews and Sue Blackmore (Bayesian inference / Bayesian statistics / Baysian Basian - getting the mis-spellings)
      • "Using probability theory to explore the reasons why people reach for paranormal explanations of "spooky" coincidences and to predict real-life coincidences"
      • "Many people are surprised when they come across coincidences such as meeting someone at a party with whom they share a trait (e.g. same birthday). It is well-known that according to probability theory the number of people needed to give a reasonable chance of encountering such coincidences is much smaller than one might think (e.g. just 23 are needed to give 50:50 odds of at least two sharing the same birthday). This suggests that most people use an unreliable intuitive rule for gauging the likelihood of coincidences. In an earlier paper, I speculated that people may use a rule of thumb in which the number of people required to bring about coincidences varies linearly with the "outlandishness" of the coincidence (as gauged by the number of groups into which each can be pigeonholed, e.g. 365 for birthdays). As probability theory shows that the number required actually varies only as the square-root of the number of groups, this would lead to the intuitive rule progressively over-estimating the number of people needed to bring a coincidence about - and thus to our being increasingly amazed by "outlandish" coincidences. By giving 124 students a questionnaire asking them to estimate the number of people needed to bring about a range of such coincidences, Sue and I found strong evidence that a linear scaling rule does seem to be used by people. This may help explain why so many feel the need to invoke "spooky" explanations for coincidences: they are being let down by their simplistic intuitive rules for judging the probability of coincidences."
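      The standard birthday-problem calculation behind the "23 people" figure can be sketched as follows (my own illustration of the textbook argument, not code from the paper):

```python
def people_needed(categories=365, target=0.5):
    # Smallest group size for which the probability that at least two
    # people share one of `categories` equally likely pigeonholes
    # reaches `target`.
    p_all_distinct = 1.0
    n = 0
    while 1.0 - p_all_distinct < target:
        n += 1
        p_all_distinct *= (categories - (n - 1)) / categories
    return n

# The Birthday Paradox: only 23 people are needed for 50:50 odds, far
# below linear intuition; the true answer grows roughly like
# sqrt(2 * ln 2 * categories), i.e. with the square root of the number
# of pigeonholes.
print(people_needed(365))  # -> 23
```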

    • Coincidences: the truth is out there by Robert A.J. Matthews
      • "Using probability theory to explore the reasons why people reach for paranormal explanations of "spooky" coincidences and to predict real-life coincidences"
      • "Coincidences often surprise us, suddenly springing up to reveal an unexpected connection between people or things. Sometimes they seem so outlandish as to demand a "supernatural" explanation. Yet anyone familiar with probability theory knows how this notoriously counter-intuitive branch of mathematics can spring big surprises on us. One of the most famous and relevant examples is the so-called Birthday Paradox, which states that in a random gathering of just 23 people, there are 50:50 odds that at least two of those present have the same birthday. Many people find this result very surprising: a recent survey of university students found a median value for the estimated size of gathering needed of 385 (Matthews & Blackmore 1995). So large a gathering is, of course, guaranteed to contain at least one coincident birthday, suggesting that the probabilistic aspects of the paradox evade many people. The same study also showed that people tend to grossly overestimate the size of gathering needed for other types of coincidence."

    • Murphy's Law of Maps (Teaching Statistics 19 34-35 1997) by Robert A.J. Matthews
      • "Analysis of urban myths, such as manifestations of Murphy's Law - "If it can go wrong, it will". Published results cover the notorious examples of tumbling toast landing butter-side down, why there are so many odd socks in our drawers, why rope, string or flex so often seems to acquire knots, and why places you're looking for so often lie in awkward places on maps."
      • CONCLUSION: "Scientists are often quick to dismiss popular beliefs like Murphy's Law as nothing more than "urban myths". However, it's often worth pausing to wonder precisely why so many people believe in a particular phenomenon. Are they really all dunderheads who just forget all the times the phenomenon does not occur? Or might there be some deeper explanation, for example based on counter-intuitive probabilistic arguments? In the case of Murphy's Law of Maps, there's a particularly simple explanation for why people think map locations tend to lie in awkward places. They do."

    • The Interrogator's Fallacy (Bulletin of the Institute of Mathematics and its Applications 31 3-5 1994) by Robert A.J. Matthews

      • "Confessional evidence has a long, controversial and notorious history. Past cultures have regarded it as the ultimate form of evidence, requiring no corroboration. It has also been frequently associated with grievous abuse of power. In modern times, the extraction of confessions under duress has been viewed with considerable scepticism, and is now inadmissible in English law. Yet while many defence lawyers are sceptical of the evidential value of confessions, the police and judiciary generally remain convinced of their importance. Confessions play a significant role in an estimated 20 per cent of all cases. In this paper, I show that a Bayesian analysis of confessional evidence supports those who adopt the sceptical view. A potentially dangerous counter-intuitive situation can arise unless confessional evidence is assessed by a jury in the appropriate way. Specifically, for a confession to contribute to the evidence of guilt, the prosecution must prove that the probability that an innocent person would confess to the crime is undoubtedly less than the probability that a guilty person would confess under the same conditions."

      • "How can such a fallacy arise? Recent psychological research by Gudjonsson suggests that for certain types of cases, the above inequality may indeed be reversed. He has shown that the psychological traits of interrogative suggestibility and compliance are important in determining how individuals cope with interrogation. Essentially, those with high scores for these two traits are less able to resist interrogation. Interrogative suggestibility has, moreover, been found to correlate with a number of cognitive and personality metrics, including reduced IQ and lack of assertiveness, while compliance appears to be linked to avoidance of conflict and obedience to authority. Given such correlates, the perpetrators of anti-establishment crimes, most notably terrorism, are clearly unlikely to score highly for suggestibility and compliance. As such, terrorism cases appear to run a considerable risk of a reversal of the inequality given above, and thus to the counter-intuitive situation of a confession of guilt actually contributing to evidence of innocence. It is interesting to note that many of the most notorious miscarriages of justice do indeed centre on terrorist cases in which confessional evidence played a key role."

      • Confessions with corroborative evidence : "To take a concrete example, A could be a confession of murder, and B the discovery of a body resulting from the confession; in this case the key inequality is clearly likely to be met, and the probability of guilt has clearly been boosted by the new evidence. However, if A is evidence that could be planted on a suspect by police, and B is the resulting confession, then the inequality does not necessarily hold: an innocent person can be persuaded to confess if they become sufficiently confused or brow-beaten by the police.

        In conclusion, confessional evidence remains fraught with danger, even when supported by corroborative evidence. "
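        The inequality at the heart of the argument can be sketched with Bayes's theorem in odds form. The numbers below are hypothetical, chosen purely to illustrate the reversal:

```python
def posterior_odds_of_guilt(prior_odds, p_confess_given_guilty,
                            p_confess_given_innocent):
    # Bayes's theorem in odds form: a confession multiplies the odds of
    # guilt by the likelihood ratio P(confess|guilty)/P(confess|innocent).
    return prior_odds * (p_confess_given_guilty / p_confess_given_innocent)

# Usual case: guilty suspects confess more readily, so a confession
# strengthens the case for guilt (posterior odds rise above the prior).
print(round(posterior_odds_of_guilt(1.0, 0.30, 0.05), 2))  # -> 6.0
# Reversed case (e.g. a highly suggestible innocent suspect under harsh
# interrogation): the same confession now *lowers* the odds of guilt -
# the Interrogator's Fallacy.
print(round(posterior_odds_of_guilt(1.0, 0.30, 0.40), 2))  # -> 0.75
```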

    • Why should clinicians care about Bayesian methods? by Robert A.J. Matthews (Bayesian inference / Bayesian statistics)
      • Five reasons to fret about frequentist methods
        1. Frequentist methods are easily misinterpreted
        2. Frequentist methods are arbitrary

          Even the most statistically apathetic must have wondered precisely why a p-value of 0.049 is deemed "statistically significant", while one of 0.051 is not. As Jeffreys (1939) emphasises, the 0.05 cut-off was chosen by R A Fisher because of a handy mathematical coincidence: a conveniently low percentage of the total area under the Normal curve - 5 per cent - lies beyond a conveniently (almost) round number of standard deviations either side of the mean: 1.96.

          Arbitrary or not, the 0.05 criterion has proved extraordinarily resilient in the face of attempts to excise it from the theory of statistical inference. In these days of powerful PCs and statistics software, this resilience can hardly be attributed to computational convenience. A more plausible explanation is that the 0.05 criterion does seem to give a clear-cut, standardised and reasonable point of reference for the otherwise seemingly hopelessly subjective task of gauging "significance" - and clinicians (like most people) like clear-cut answers. Furthermore, one can once again argue that there can't be that much wrong with the 0.05 criterion, as it has been used for decades without the scientific sky falling in. Thus this second supposedly grave flaw in frequentist methods can all too easily be dismissed as of no practical importance for the working clinician.

        3. Frequentist methods exaggerate significance
        4. Frequentist methods fail worst when you need them most
        5. Frequentist methods are a poor basis for regulatory assessment
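      The 0.05/1.96 correspondence noted under reason 2 can be checked directly. A minimal sketch (my own, assuming the standard Normal model the text describes), using the complementary error function for the two-sided tail area:

```python
import math

def two_sided_p(z):
    # Two-sided tail probability of the standard Normal beyond +/- z,
    # via the complementary error function: p = erfc(z / sqrt(2)).
    return math.erfc(z / math.sqrt(2.0))

print(round(two_sided_p(1.96), 3))   # -> 0.05, Fisher's conveniently near-round cut-off
print(round(two_sided_p(1.996), 3))  # -> 0.046, the Princeton z-score discussed earlier
```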

      • Confidence intervals: better than p-values?
      • What is required to change old habits?
      • Credibility: a suitable case for Bayesian treatment
      • Derivation of the Critical Prior Interval (CPI)
      • Credibility assessment: a worked example

    • "Naturalism is an essential part of science and critical inquiry" by Steven D. Schafersman

      • "The surest sign that postmodernism is wrong is that postmodern critiques of science have had absolutely no effect on the practice of science or the continuing achievements of science. If there had been any truth at all to postmodernism, scientists would have changed their scientific methods and procedures to try to escape the postmodern pitfalls of relativism, subjectivism, and externalism. The fact that few scientists know or care about postmodernism, and none have been influenced by it, speaks volumes. (Karl Popper's neopositivistic falsificationism, on the other hand, has--for better or worse--been remarkably influential.) Some scientists have taken the time to understand postmodernism and, in their minds, it is easy to refute (Gross and Levitt, 1994; Gross, Levitt, and Lewis, 1996). Although some philosophers probably don't believe this, scientists are and have long been aware of the minefields of relativism, subjectivism, and externalism, and they firmly believe that the scientific method, as it is now and as it has long been practiced, eliminates--not just minimizes--these problems. (Admittedly, this process of elimination is historical and may take years, decades, or even centuries!) I have long believed this myself, and now that I have become acquainted with the postmodern critiques of science, I still believe it. Avoiding relativism, subjectivism, and externalism is what scientists learn in science school. The end result of the scientific method is the most reliable knowledge that humans can possess, although not necessarily ultimate, absolutely true knowledge (whatever that means), but that is good enough and better than the alternative."

      • "But I maintain that there is at least one criterion of legitimate science that correctly identifies scientific creationism and all forms of supernatural explanation in science as pseudoscience. This is the criterion of testability. It dates from the beginning of the nineteenth century when scientists began to explicitly eschew supernatural explanations, and it was quickly recognized and identified in the work of the first philosopher of science, John Herschel, who is responsible for first explicating the hypothetico-deductive method of science. It is now commonly accepted, for example, that Charles Darwin deliberately used Herschel's characterization of correct scientific method in his effort to establish the fact and his theory of evolution in 1859 (Ghiselin, 1969). In the twentieth century, Karl Popper championed and extended this same idea in his work on prediction, deduction, testing, and falsification in science. I am completely aware that Popper's solution to the demarcation problem is incomplete. (How would one apply it in the continuing controversy over the reality of ESP, for example, when it is the very predictions, tests, and statistical degree of falsification that is under controversy?) I maintain, however, that the criterion of testability or falsifiability is a necessary but not sufficient solution to the demarcation problem, and while I admit it cannot distinguish science from pseudoscience in all areas of interest, the "necessity of being susceptible to falsification" criterion is quite capable of eliminating all supernatural explanations, such as creationism and intelligent design, from legitimate science--because such explanations cannot be falsified."

      • "Larry Laudan (three papers reprinted in Ruse, 1996a) has contempt for the testability criterion, claiming that many pseudosciences would be scientific under this criterion because they are, in principle, falsifiable, and their claims have been falsified. He puts scientific creationism in this category. I am not sure if Laudan is as familiar with the creationist literature as Michael Ruse and I, but he is wrong. I agree with Laudan that when pseudosciences like scientific creationism make statements about the natural realm, such as a 6000-year old Earth or a specific fossil sequence, the predictions are easily falsified. This is also true for the many other pseudosciences that Laudan identifies. If they restricted their predictive "hypotheses" to the natural realm, used valid arguments, accepted empirical evidence to the contrary as valid, and agreed that they would change their views if conflicting evidence was present, all of these pseudosciences would long ago have disappeared because they would have been falsified. The reason they haven't vanished is because their proponents invariably make claims that have supernatural, preternatural, and paranormal elements, and these elements cannot be tested and falsified, so pseudosciences can persist just as Popper claimed (but then he once thought that modern scientific evolutionary theory was unfalsifiable, so what does he know?)."

    • Rudolf Carnap (1891-1970)

    • Billionaire Soros stakes fortune on 'matter of life and death' - defeating George Bush

      • George Soros has donated almost $5bn (£3bn) over the years to help emerging democracies in Eastern Europe recover from the shadow of tyranny. Now he is applying the same principles, and a large chunk of his fortune, to the United States, where he believes the defeat of George Bush in next year's presidential election is "a matter of life and death".

        So far, he has spent more than $15m: two-thirds of it going to a liberal-leaning group called America Coming Together, which intends to mobilise voters in battleground states next November; $3m of it going to a new Washington think-tank run by Bill Clinton's former chief of staff, John Podesta; and $2.5m to the passionately anti-Bush internet lobbying group, to help pay for television advertisements attacking the President.

        Political donations on this scale have precedents. On the right, figures such as Richard Mellon Scaife and Howard Ahmanson have given hundreds of millions of dollars over several decades on political projects both high (setting up the Heritage Foundation think-tank, the driving engine of the Reagan presidency) and low (bankrolling investigations into President Clinton's sexual indiscretions and the suicide of the White House insider Vincent Foster).

        But on the left it is almost unheard of. Mr Soros has given money to political campaigns before - $122,000 in the 2000 elections alone. This, though, is very different. In recent interviews he has likened the with-us-or-against-us rhetoric of the Bush administration to the political language of Nazi Germany. And in a forthcoming book, The Bubble of American Supremacy, he argues that the destructive arrogance of the White House, in Iraq and elsewhere, is like an overheating of the stock market that must and will be corrected.

        The Hungarian born financier and philanthropist describes the Bush administration's policies as a crude form of social Darwinism. "I call it crude because it ignores the role of co-operation in the survival of the fittest, and puts all the emphasis on competition." And he explains why the current administration is so much at odds with the driving ideology of his worldwide Open Society Institute. "The supremacist ideology of the Bush Administration stands in opposition to the principles of an open society, which recognise that people have different views and that nobody is in possession of the ultimate truth," he writes. "When President Bush says, as he does frequently, that freedom will prevail, he means that America will prevail. In a free and open society, people are supposed to decide for themselves what they mean by freedom and democracy, and not simply follow America's lead ... A chasm has opened between America and the rest of the world."

        Unlike other critics who have made casual comparisons between the Bush White House and the Nazis, Mr Soros speaks with some authority - he survived the German occupation of Budapest as a boy.

        That has not deterred prominent Republicans from hooting with indignation, or from accusing him of hypocrisy because Mr Soros has been a champion of campaign finance reform intended to keep big-money donations out of politics. "It's incredibly ironic that George Soros is trying to create a more open society by using an unregulated, under-the-radar-screen, shadowy, soft-money group to do it," the Republican National Committee spokeswoman Christine Iverson said recently. The Washington Post has similar reservations, writing in a recent editorial: "Wasn't the whole point of the new campaign finance law to get big checks of this kind out of politics? Are these huge donations healthy for small-d democracy, not just big-D Democrats?" Mr Soros's response seems to be: I will do whatever it takes, if the result is defeat for President Bush.

        If the Republicans are alarmed, it is partly because the Soros donations are part of a new form of political activism on the left, one that takes advantage of the internet. The group, with its 1.8 million members, has proved it can raise millions of dollars in days for a liberal cause and act as a counterweight to political organisations, including the Democratic Party leadership.



      • Soros calls for 'regime change' in US:



    • What is logical empiricism?
    • Bayesian Modeling in the Social Sciences: an introduction to Markov chain Monte Carlo, materials from 10 hours of lectures given to the ICPSR Summer School on Quantitative Methods, July 1999 PDF (3M) by Simon Jackman

    • On Science, Scientific Method And Evolution Of Scientific Thought: A Philosophy Of Science Perspective Of Quasi-Experimentation

      • "Logical Empiricism
        Essentially, Carnap replaced the concept of verification with the idea of "gradually increasing confirmation" (1953, p. 48). He argued that if verification is taken to mean the "complete and definitive establishment of truth," then universal statements can never be verified. However, they may be "confirmed" by the accumulation of successful empirical tests. Thus, science progresses through the accumulation of multiple confirming instances obtained under a wide variety of circumstances and conditions.
        Logical empiricists believe that all knowledge begins with observation. This leads to empirical generalizations among observable entities. As our ideas progress, theories are formulated deductively to explain the generalizations, and new evidence is required to confirm or disconfirm the theories. Throughout the process, data are given precedence. Indeed, the entire process is viewed as essentially an inductive one. Science in general and knowledge in particular are believed to occur in an upward fashion: from data to theory to understanding (Bagozzi, 1984). Feigl (1970: p. 7) terms this as "an 'upward seepage' of meaning from the observational terms to the theoretical concepts," and it is construed in a similar way by Hempel (1952: p. 36), Carnap (1939: p. 65) and other logical empiricists.
        Logical empiricism is characterized by the inductive statistical method. In this view, science begins with observation, and its theories are ultimately justified by the accumulation of further observations, which provide probabilistic support for its conclusion. Of course, the logical empiricist's use of a probabilistic linkage between the explanans and the explanandum does not avoid the problem of induction. It remains to be shown how a finite number of observations can lead to the logical conclusion that a universal statement is "probably true" (Black, 1967). Moreover, attempts to justify induction on the basis of experience are necessarily circular. The argument that induction has worked successfully in the past is itself an inductive argument and cannot be used to support the principle of induction (Chalmers, 1976)."

      • "Popper And Falsificationism
        Unlike positivists, Popper accepted the fact that "observation always presupposes the existence of some system of expectations" (1972: p. 344). For Popper, the scientific process begins when observations clash with existing theories or preconceptions. To solve this scientific problem, a theory is proposed and the logical consequences of the theory (hypotheses) are subjected to rigorous empirical tests. The objective of testing is the refutation of the hypothesis. When a theory's predictions are falsified, it is to be ruthlessly rejected. Those theories that survive falsification are said to be corroborated and tentatively accepted (Anderson, 1983).
        In contrast to the gradually increasing confirmation of induction, falsificationism substitutes the logical necessity of deduction. Popper exploits the fact that a universal hypothesis can be falsified by a single negative instance (Chalmers, 1976). In Popper's approach, if the deductively derived hypotheses are shown to be false, the theory itself is taken to be false. Thus the problem of induction is seemingly avoided by denying that science rests on inductive inference. Anderson (1983) notes, however, that Popper's notion of corroboration itself depends on an inductive inference. According to falsificationism, then, science progresses by a process of "conjectures and refutations" (Popper 1962, p. 46). In this perspective, the objective of science is to solve problems. "

      • Research Traditions
        Like Kuhn and Lakatos, Laudan sees science operating within a conceptual framework that he calls a research tradition (Anderson, 1983). The research tradition consists of a number of specific theories, along with a set of metaphysical and conceptual assumptions that are shared by those scientists who adhere to the tradition. A major function of the research tradition is to provide a set of methodological and philosophical guidelines for the further development of the tradition (Anderson, 1982).
        Following both Kuhn and Popper, Laudan argues that the objective of science is to solve problems -- that is, to provide "acceptable answers to interesting questions" (Laudan, 1977, p. 13). On this view, the "truth" or "falsity" of a theory is irrelevant as an appraisal criterion. The key question is whether the theory offers an explanation for problems that arise when we encounter something in the natural or social environment which clashes with our preconceived notions or which is otherwise in need of explanation (Anderson, 1983).
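    The falsificationism excerpt above turns on a single deductive step, modus tollens: a universal hypothesis entails observable consequences, and one refuted consequence refutes the hypothesis. A minimal sketch in standard logical notation (the symbols H, O, R, B, a are illustrative choices, not drawn from the quoted text):

    ```latex
    % Falsification as modus tollens: if hypothesis H entails observation O,
    % then observing not-O refutes H.
    \[
      \frac{H \rightarrow O \qquad \neg O}{\neg H}
    \]
    % A single negative instance refutes a universal claim,
    % e.g. "all ravens are black" (R(x): x is a raven, B(x): x is black):
    \[
      \frac{\forall x\,\bigl(R(x) \rightarrow B(x)\bigr) \qquad R(a) \wedge \neg B(a)}
           {\neg\,\forall x\,\bigl(R(x) \rightarrow B(x)\bigr)}
    \]
    ```

    Note the asymmetry Popper relies on: observing an O consistent with H does not deductively establish H, so no number of confirming instances proves a universal hypothesis, while a single counter-instance disproves it.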

    David Corfield writes:
     > Does either side of this debate have ideas about where it wants to
     > see mathematics going, or the way it is taught?
    Speaking only for myself, I'd like to see mathematics and the
    philosophy of mathematics become more objective and reality-oriented.
    I would hope for the following: (1) better integration of pure and
    applied mathematics and various mathematical sciences; (2) healthy
    skepticism about Bourbakian "mathematical architecture"; (3) greater
    attention to and understanding of f.o.m. issues; (4) more emphasis on
    mathematical rigor, precise thinking, etc.  I feel that all of these
    would have a very positive impact on mathematics education as well as
    philosophy of mathematics.
     > There have been consequences (possibly unintended) of the 'set
     > theory as foundation of math' picture. It has worked its way into
     > the minds of the larger part of Anglo-American philosophers. Are
     > FOM set theorists happy with that, or do they take their work to be
     > misappropriated? If the latter, should they not say so?
    I think one of the bad consequences was the "new math" of the 1960's
    and 1970's.  This was a US educational fad which consisted of teaching
    set-theoretic f.o.m. to young school children in inappropriate ways.
    See Morris Kline's book "Why Johnny Can't Add".
    Though I see a lot of f.o.m. value in set theory, I'm not particularly
    wedded to it.  In an ideal world, I would expect philosophers of
    mathematics to explore objective alternatives to set-theoretic f.o.m.
    In the real world that we live in, I'm alarmed about the subjectivist
    or postmodernist trend in current academic philosophy, and I don't see
    how anything good for mathematics can come of it.
     > David Corfield
     > School of Cultural Studies
     > Leeds Metropolitan University
     > U.K.
     > Interests: Historicist philosophy of mathematics, cognitive psychology
    Hmmm, cultural studies.  Does this smack of the subjectivist turn that
    I alluded to above?  And what about "historicist philosophy"?  Is it
    Hegelian?  I'm not a big fan of Popper, but I think Popper's "The
    Poverty of Historicism" made some good points.  An even better attack
    on historicism (not widely known, unfortunately) is Ludwig von Mises,
    "Theory and History".
    -- Steve
    Name: Stephen G. Simpson
    Position: Professor of Mathematics
    Institution: Penn State University
    Research interest: foundations of mathematics
    More information:



    If you are feeling sociable, my new E-mail address is [address now invalid] (replace the *at* with an @ ). Old E-mail addresses might be giving forwarding or reliability problems. Please use clear titles in any Email - otherwise messages might accidentally get put in the SPAM list due to the large amount of junk Email being received. So, if you don't get an expected reply to any messages, please try again.