Destruction of the Continental School

Stephen J Bensman notsjb at LSU.EDU
Thu Oct 7 16:07:39 EDT 2010


More fun from my book.  Some of you may not like what is written, but it expresses what I consider the truth.  I am a “Tea Party conservative” for what I consider very good reasons, having earned a doctorate in Russian history.  You thankfully won’t get any more of this, because the book becomes highly technical after this point; but this section, I think, many will find interesting.

 

Stephen J. Bensman

LSU Libraries

Louisiana State University

Baton Rouge, LA   70803

USA

notsjb at lsu.edu

 

6. Creating the Stochastic Models: The Melding of British and Continental Statistics 

The State of Statistical and Probabilistic Theory at the Turn of the 20th Century

Salsburg (2001, p. 17) ascribes revolutionary importance to Pearson’s system of skew distributions.  According to him, before Pearson the “things” that science dealt with were real and palpable: planets moving in space, blood coursing through veins, chemical elements and compounds.  Pearson, however, proposed that such observable phenomena are only random reflections and that what is real is the probability distribution.  In Salsburg’s view, the real “things” of science thus became not objects we can observe and hold but mathematical functions describing the randomness of what we can observe.  What a scientific investigation must therefore determine are the parameters of the distribution.  What Pearson failed to realize, according to Salsburg, was that these parameters can never actually be determined but only estimated from the data.  This is somewhat ironic, given Pearson’s philosophical idealism, whereby observable phenomena (such as statistics) are merely external manifestations or mental constructs of Ideas (such as parameters).
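
Salsburg’s distinction can be made concrete with a small simulation.  The sketch below is mine, not the book’s: it assumes Python with NumPy, uses a gamma distribution (a Pearson Type III curve) with invented “true” parameters, and recovers them by the method of moments.  The estimates close in on the true values as the sample grows but never become them.

```python
import numpy as np

# Salsburg's point in miniature: we can know the data, never the parameters.
# Draw samples from a gamma distribution (a Pearson Type III curve) with
# known shape and scale, then recover those parameters by the method of
# moments.  The "true" values here are illustrative assumptions only.
rng = np.random.default_rng(0)
shape_true, scale_true = 2.0, 3.0   # the "Ideas": true but unknowable in practice

for n in (100, 10_000, 1_000_000):
    x = rng.gamma(shape_true, scale_true, size=n)
    mean, var = x.mean(), x.var()
    scale_hat = var / mean            # from mean = k*theta, var = k*theta^2
    shape_hat = mean / scale_hat
    print(f"n={n:>9,}: shape estimate {shape_hat:.4f}, scale estimate {scale_hat:.4f}")
# The estimates close in on (2.0, 3.0) as n grows but never *are* (2.0, 3.0).
```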

            In an invited address to an American Statistical Association annual meeting, Neyman (1960) portrayed the history of statistics as a succession of stages in the study of “indeterminism.”  He defined such study as follows:

        The words “indeterministic study”…designate research aiming to
        determine how frequently a quantity X characterizing the phenomena
        considered assumes its various particular values.  If the purpose of
        research is to establish the exact value of X as a function of other
        variables, then this research is “deterministic.”  (p. 625)

From this perspective Neyman defined the first stage as the period of “marginal indeterminism.”  This was the period symbolized by the names Laplace and Gauss, in which research in science was all deterministic, with just one domain, that of errors of measurement, treated indeterministically.  Neyman called the second stage the period of “static indeterminism”; it covered roughly the end of the 19th century and the beginning of the 20th century and was symbolized by Edgeworth, Galton, Karl Pearson, and others.  Here the main subject of study was a “population,” and efforts were made to develop systems of frequency curves to describe the empirical distributions analytically.  The third discernible stage, according to Neyman, lasted roughly from 1920 to 1940 and could be termed the period of “static indeterministic experimentation.”  Its leading figure was R. A. Fisher, and the typical problem considered was whether two given populations have the same distribution of X.  This problem led to the development of tests of statistical hypotheses and of estimation theory.
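
Neyman’s typical third-stage problem can be put in a few lines.  The sketch below is illustrative only: it assumes Python with NumPy and SciPy, invents two normal populations differing by a small shift, and applies the two-sample Kolmogorov-Smirnov test (the two-sample form being due to Smirnov, who appears below in Žarković’s list of Soviet mathematicians) to ask whether they share the same distribution of X.

```python
import numpy as np
from scipy.stats import ks_2samp

# Neyman's "typical problem" of the third stage: do two given populations
# have the same distribution of X?  The populations and the 0.3 shift are
# invented purely for illustration.
rng = np.random.default_rng(1)
a = rng.normal(loc=0.0, scale=1.0, size=500)   # sample from population A
b = rng.normal(loc=0.3, scale=1.0, size=500)   # sample from population B

stat, p = ks_2samp(a, b)
print(f"KS statistic = {stat:.3f}, p-value = {p:.4g}")
# A small p-value is evidence that the two distributions of X differ.
```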

            Neyman played a major role in the transition from the second to the third stage, and he considered Pearson’s system of skew curves to be of the utmost importance.  In a seminal article, in which he systematically employed the term “contagious distribution” for the first time in the English language (Douglas, 1982), Neyman (1939) assessed the importance of the Pearson curves in the following manner:

        …[The Pearson curves]…are very important…because of the empirical
        fact, that it is but rarely that we find in practice an empirical
        distribution, which could not be satisfactorily fitted by any of such
        curves.  Consequently, wishing to deduce some test applicable in this
        or that case, we may usefully assume that the basic distribution is
        one of the Pearson system and, owing to the frequently continuous
        character of the connection between the conditions and the final
        results, our final formula will be approximately valid when applied
        to the data under consideration.  (p. 55)

However, the Pearson curves were merely descriptive mathematical graduation curves, logically requiring a further step.  After stipulating that the theoretical distribution must satisfactorily fit the empirical data, Neyman set forth this further step by declaring, “But we may legitimately require something else: an ‘explanation’ of the machinery producing the empirical distributions of a given kind” (p. 55).  He put “explanation” in quotation marks to signal that mathematics deals with a conceptual sphere quite distinct from the perceptual and, therefore, cannot produce a real explanation of phenomena but only “some ‘interpolation formula,’ some system of conceptions and hypotheses, the consequences of which are approximately similar to the observable facts” (p. 55).

            The “machinery” called for by Neyman was created in a process of melding Continental statistics into British statistics.  This melding was marked by two major achievements.  First, the Lexis Ratio was incorporated into the chi-squared test, initially created by Karl Pearson and further developed by R. A. Fisher, to determine the goodness of fit of empirical data to a theoretical distribution.  Second, the Poisson process, whose importance was first recognized by Bortkiewicz, was integrated with elements of the Pearson curve system to create stochastic models for the compound and contagious distributions that underlie information science and scientometrics.  Together, these two achievements enable one to posit and identify the stochastic processes operative in these fields.
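
As a concrete illustration of the first achievement, and of the Poisson distribution whose importance Bortkiewicz first recognized, here is a minimal sketch of a chi-squared goodness-of-fit test.  It assumes Python with NumPy and SciPy; the counts are Bortkiewicz’s celebrated horse-kick data as standardly reported (109, 65, 22, 3, and 1 corps-years with 0 through 4 deaths).

```python
import numpy as np
from scipy.stats import poisson, chisquare

# Pearson's chi-squared test used to check the fit of count data to a
# Poisson distribution -- the kind of check behind Bortkiewicz's famous
# horse-kick analysis.  Counts are the standardly reported figures.
observed = np.array([109, 65, 22, 3, 1])
n = observed.sum()                                  # 200 corps-years
lam = (observed * np.arange(5)).sum() / n           # ML estimate of the mean

# Expected frequencies under Poisson(lam), lumping the upper tail into
# the last cell so observed and expected totals agree.
p = poisson.pmf(np.arange(4), lam)
p = np.append(p, 1.0 - p.sum())                     # P(X >= 4)
expected = n * p

# One degree of freedom is lost to estimating lam, hence ddof=1.
stat, pval = chisquare(observed, expected, ddof=1)
print(f"chi-squared = {stat:.3f}, p-value = {pval:.4f}")
# A comfortably large p-value: no evidence against the Poisson "machinery".
```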

The melding of Continental statistics into British statistics was facilitated by the devastation of the former resulting from the rise of murderous totalitarian regimes first in Russia and then in Germany.  Žarković (1956) described what happened in Russia thus:

        …political considerations became an increasingly pronounced factor
        in the development of Russia’s statistics.  This brought about the
        gradual disappearance of the use of theory in the practical activity
        of the statistical administration.  In the late thirties Vestnik
        statistiki began to close its pages to papers in which statistical
        problems were dealt with mathematically.  At the end of that period
        they disappeared completely and have not appeared since.  The result
        of this new trend was that statisticians abandoned practice to
        continue their work at the universities and other scientific
        institutions where they pursued statistics under the name of some
        other subject.  Officially, Romanovsky, Kolmogorov, Smirnov and many
        others are mathematicians divorced from statistics….  According to
        official views, statistics became an instrument for planning the
        national economy.  Consequently, its basis is the Marx-Lenin
        political economy; it represents a social science or, in other
        words, a class science.  The law of large numbers, the idea of
        random deviations, and everything else belonging to the mathematical
        theory of statistics were swept away as the constituent elements of
        the false universal theory of statistical science….  (p. 338)

As orthodox Marxist theory descended on Soviet statistics, the numbers generated by the Central Statistical Administration and its successors became more and more suspect.  Murray Feshbach, the expert on Soviet demography at the US Census Bureau, stated that the Soviets held back census and other data because the results were so implicitly negative for the Russian leadership (Murphy, 1983, p. 51).  Feshbach was one of the first to uncover Soviet demographic problems in the form of high infant mortality and declining life expectancies, and his analyses were eagerly sought by the Russians themselves.  In a recent evaluation of the current situation, Feshbach (2008: see URL below) expressed the opinion that the Russian tragedy is happening inexorably, that Russian society may actually be weaker than it was in Soviet times, and that the decades since the breakup of the Soviet Union have witnessed an appalling deterioration in the health of the Russian population.  The damage done by Communism may have been permanent.

http://www.washingtonpost.com/wp-dyn/content/article/2008/10/03/AR2008100301976.html

            The Bolshevik attack on statistics did not stop with the economy.  Statistics had largely developed as the handmaiden of genetics, and Stalin embraced Trofim Lysenko’s anti-Mendelian theories on the inheritability of acquired characteristics.  In 1940 Lysenko became director of the Institute of Genetics, and, as Salsburg (2001, p. 149) reports, biologists trying to follow R. A. Fisher’s work in mathematical genetics were discouraged or even sent to prison.  One of Lysenko’s victims was the noted botanist and geneticist Nikolai Vavilov, who had collaborated with the British Mendelian, William Bateson.  Vavilov was arrested in 1940 and died of malnutrition in the camps in 1943.

            Genetics was also instrumental in the destruction of Continental statistics in Germany, where Nazi race theory can be described as eugenics on steroids.  In a section entitled with the Spanish Falangist battle cry “Viva la muerte!” to symbolize the rampant anti-intellectualism overtaking Continental Europe (Spain being the initial battleground between the twin horrors of Bolshevism and Nazism), Salsburg (2001, p. 88) points out that Hitler’s racial policies decimated German universities, because many of the great European mathematicians were Jewish, of Jewish descent, or married to Jews, and most of those with no Jewish connection were opposed to Nazi plans.  The litany of great statisticians of the German school (including in this school, as Keynes did, not only Germans but other nationalities who wrote in German and were in habitual contact with the German scientific world) leaving Europe to take refuge primarily in the “Anglo-Saxon” United States is quite impressive.

            One of these was Richard von Mises, who laid the modern foundations of the frequency theory of probability.  According to his collaborator and wife, Geiringer (1978), in 1933 von Mises recognized that it would be “both unwise and undignified” (p. 1229) to remain at the University of Berlin and accepted a position at the University of Istanbul.  In 1939, as World War II approached, he felt that he had to leave Istanbul and accepted a position at Harvard University.  The man certainly could calculate his probabilities.

            Two of the German school fleeing Europe were instrumental in the final formulation and understanding of the stochastic models underlying scientometric and information science distributions.  One was George Pólya, who pioneered contagious distributions; it was from Pólya’s work that Neyman (1939, p. 36) took the term “contagious.”  Pólya did his seminal work at the Swiss Federal Institute of Technology in Zürich, but in 1940 the political situation in Europe forced him to move to the United States, where he ultimately took up a position at Stanford.  The other was William Feller, who was fired from the University of Kiel in 1933 because of his “mixed descent.”  Feller made it to the US in 1939, where he worked first at Brown, then at Cornell, and finally at Princeton, winning the prestigious National Medal of Science in 1969 for his work in mathematics and statistics.  His textbook, An Introduction to Probability Theory and Its Applications, became a standard that is still highly consulted today.

            Others of the German school finding refuge in the US were Emil Gumbel, Herman Hartley, and Abraham Wald.  Gumbel was chased from the University of Heidelberg in 1932 by Nazi-led student groups, who thereby saved his life; he fled first to France and then to New York, where he did his best work as an adjunct professor at Columbia University.  Together with R. A. Fisher, he pioneered the mathematical field of extreme value theory.  As for Hartley, in 1934 he emigrated first to England, where he became a close collaborator of Egon Pearson, Karl’s son, and then in 1953 to the US, working first at Iowa State, which Snedecor had made a center of British biometrics, and then at Texas A&M.  Wald was forced to flee Austria in 1938 at the time of the Anschluss, joining the faculty of Columbia University in 1941, where he paid his tormentors back by doing statistical studies of aircraft survivability that made the armor protection of US bombers more effective, enabling their crews to level German cities more safely.
The arrival of such intellectual giants was instrumental in the rise of the United States to scientific preeminence.      
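
Because the term “contagious” recurs throughout this history, a minimal illustration of Pólya’s idea may be useful.  The sketch below assumes Python with NumPy, and every number in it is invented for illustration: in a Pólya urn each drawn ball is returned together with an extra ball of its own colour, so success breeds success, and the counts come out far more dispersed than classical binomial sampling would allow.

```python
import numpy as np

# What Polya meant by "contagious": each success makes the next success
# more likely.  Draw a ball, return it with one extra ball of the same
# colour, and count the white draws.  All numbers are illustrative.
rng = np.random.default_rng(2)

def polya_draws(n_draws=50, white=1, black=1, add=1):
    """Return the number of white balls drawn in one run of the urn."""
    w, b, hits = white, black, 0
    for _ in range(n_draws):
        if rng.random() < w / (w + b):
            hits += 1
            w += add        # contagion: a white draw breeds white balls
        else:
            b += add
    return hits

counts = np.array([polya_draws() for _ in range(10_000)])
print(f"mean = {counts.mean():.2f}, variance = {counts.var():.2f}")
# An ordinary binomial with p = 1/2 would have variance 50/4 = 12.5;
# the urn's variance is an order of magnitude larger.  This overdispersion
# is the signature of contagious (here, beta-binomial) distributions.
```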

     

 


