by Robert L. Park
Park charges Hagelin and the study's co-authors with pseudoscience, but the accusation rests on a critique that is superficial, highly polemical, and seriously flawed. His article contains not a single statistic and betrays no evidence that he ever read the study.
Park's book urges protection against the promulgation of flawed scientific ideas by subjecting them to proper experimental testing: "The success and credibility of science are anchored in the willingness of scientists to . . . (1) expose new ideas and results to independent testing and replication . . . (2) abandon or modify accepted facts or theories in the light of more complete or reliable experimental evidence."
Yet Park appears to feel that the hypothesis of Hagelin's study is ridiculous on its face, and that no serious investigation of the claim is necessary. In the context of dismissing Hagelin's study, Park quotes H.L. Mencken: "The most common of all follies is to believe in the palpably untrue." In other words, Park maintains that it should have been obvious at the outset that the theory being tested was false, so why bother to examine the evidence? Apparently he believes that ideas that do not tally with the current scientific paradigm can be written off without serious consideration. This is remarkable, given Park's advocacy of scientific objectivity and careful scrutiny of evidence.
The problem here is one that arises whenever science is confronted with novel or unconventional ideas. The history of science shows that new theories are often controversial and initially fall outside the realm of what is considered mainstream science. Theories are profoundly new only when they introduce new ways of thinking; if the theoretical ideas in question are already orthodox, they are unlikely to have a revolutionary impact. The great advances of twentieth-century physics, quantum mechanics and general relativity, were regarded with great skepticism by many physicists for decades. Einstein received his Nobel Prize not for relativity, which the Nobel committee still considered too controversial, but for his explanation of the photoelectric effect. Marconi, the pioneer of radio communication, and Tesla, who pioneered the development of electrical devices, were regarded as crackpots by many of their contemporaries. The list goes on and on. Yet Park states: "When better information is available, science textbooks are rewritten with barely a backward glance." If only the onward and upward march were so smooth! As Thomas Kuhn pointed out thirty years ago, scientists have prejudices like the rest of us, as Park unfortunately shows.
What chance is there that groups practicing Transcendental Meditation can reduce violent crime? If Park had been interested in a serious scientific discussion of this question, he should certainly have mentioned that:
Park's failure to mention any of these facts in his book can be attributed to only two factors: ignorance or bias, neither of which is acceptable from a responsible scientist, particularly one who takes so strong a stand.
In his book, Park puts forward the specious claim that the D.C. experiment was a failure. Yet total violent crime not only decreased during the 8 weeks of the experiment but also closely tracked the rise in the number of participating TM meditators, as predicted. The 23% drop in violent crime was confirmed to be statistically significant by time series analysis: the probability that the decrease was due to chance is less than two in a billion.
Park rejects out of hand the statistical modeling of the data using time series analysis, because to do otherwise would be to give credence to the study's main scientific finding. Time series analysis is a sophisticated statistical tool for investigating whether factors other than the presumed causal variable might account for the results. Hagelin's study used time series analysis to rule out a long list of alternative explanations, including weather variables, seasonal effects, changes in police surveillance, and trends and cyclical patterns inherent in the crime data. Had this not been done, Park would have had concrete grounds for attacking the research methodology.
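The logic of controlling for such alternative explanations can be illustrated with a toy version (synthetic data and ordinary least squares, not the actual transfer-function model used in the published study): regress weekly crime counts on the candidate confounders, temperature and trend, plus a 0/1 intervention indicator, and check whether the intervention coefficient remains negative once the confounders are accounted for. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic weekly data: 100 pre-intervention weeks + 8 intervention weeks.
n_pre, n_int = 100, 8
n = n_pre + n_int
week = np.arange(n)
temp = 60 + 20 * np.sin(2 * np.pi * week / 52)       # seasonal temperature cycle
intervention = (week >= n_pre).astype(float)         # 0/1 indicator for the 8 weeks

# Assumed data-generating process (for illustration only): crime rises with
# temperature, drifts upward over time, and drops by 25 during the intervention.
crime = 200 + 2.0 * temp + 0.3 * week - 25 * intervention + rng.normal(0, 5, n)

# OLS with confounders included: design matrix = [1, temp, trend, intervention].
X = np.column_stack([np.ones(n), temp, week, intervention])
beta, *_ = np.linalg.lstsq(X, crime, rcond=None)

print(f"estimated intervention effect: {beta[3]:.1f}")  # recovers roughly -25
```

The point of the sketch is that the estimated intervention effect stays close to its true value even though temperature and trend are in the model; if the drop had really been caused by weather or an ongoing trend, the intervention coefficient would shrink toward zero instead.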
Instead, Park merely lampoons Hagelin's study, claiming that the use of time series analysis was "technobabble" meant only to give the appearance of science. Park's only allusion to the overall study finding is to complain about the use of time series analysis, which is clearly the correct statistical tool for a study of this type. He waves away the evidence and the state-of-the-art statistical analysis, proclaiming that "it was a clinic in data distortion," offers no supporting data or analysis for that assertion, and makes not another comment about it.
He concludes: "This was pseudoscience . . . which is not to say that those involved were not sincere in their belief. They may have believed so fervently that they felt a responsibility to make the facts support their belief" [emphasis in the original]. In other words, Park makes the unfounded claim that the researchers falsified evidence, evidence supplied by the Washington, D.C., police. With high-handed condescension he acknowledges the sincere belief of the researchers while making the most serious charge of scientific misconduct. These statements amount to a charge of scientific fraud. They are an insult to the integrity of the researchers, the Project Review Board, the editors and reviewers of Social Indicators Research, and the District of Columbia Metropolitan Police Department's statistician, who provided the FBI crime data and co-authored the study.
Park's objection to our use of time series analysis is not based on any scientific argument, but merely echoes the comment of a reporter regarding the use of time series analysis to predict levels of violent crime: "How could you know what the rates would have been?" But there is no mystery here. Violent crime levels are predictable on the basis of (1) prior crime trends and (2) temperature, a fact that is well known among criminologists, and one that was clearly explained at the press conference Park attended, as well as in the published paper.
This shows up clearly in a graph of the Washington data for the five years prior to the experiment (1988-1992). When average temperature and average levels of homicides, rapes, and assaults are plotted week by week over the year, the crime and temperature curves lie right on top of each other when the vertical axes are appropriately scaled. This shows that violent crime levels were directly proportional to temperature, and that violent crime can therefore be accurately predicted from temperature and previous crime data. The same pattern holds in the first months of 1993, but in the middle of the experimental period (when the meditating group was approaching its maximum size) the violent crime curve drops well below the predicted curve and stays low for several weeks. In other words, during the experiment in 1993 a drop in violent crime was clearly evident in the raw data, even without time series analysis.
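That prediction logic can be sketched in a few lines, using made-up numbers rather than the actual Washington data: fit the crime-temperature relation on historical weeks, then compare observed crime in the experimental weeks against the temperature-based (counterfactual) prediction. The figures here simply assume a linear relation and a 23% shortfall like the one reported in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic pre-experiment weeks (a stand-in for 1988-1992): crime tracks temperature.
temp_pre = rng.uniform(30, 90, 260)                       # five years of weekly temps (F)
crime_pre = 100 + 3.0 * temp_pre + rng.normal(0, 8, 260)  # assumed linear relation

# Fit the temperature-crime relation on the historical weeks.
slope, intercept = np.polyfit(temp_pre, crime_pre, 1)

# Experimental weeks: similar summer temperatures, but crime runs below prediction.
temp_exp = np.array([75.0, 78, 80, 82, 85, 84, 81, 79])   # 8 hot summer weeks
crime_exp = (100 + 3.0 * temp_exp) * 0.77                 # a built-in 23% shortfall
predicted = intercept + slope * temp_exp

shortfall = 1 - crime_exp.sum() / predicted.sum()
print(f"observed crime is {shortfall:.0%} below the temperature-based prediction")
```

The shortfall recovered by the fit is the quantity at issue: how far observed crime fell below what temperature alone would predict for those weeks.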
Park also objects to our calculation of how much violent crime dropped. First, this calculation was an adjunct step performed after the time series analysis, so challenging it does not contradict our main result: a reduction in violent crime is evident in the raw data, before any statistical analysis, and is therefore unassailable. Second, the amount of the reduction is calculable because, as we have seen, violent crime levels are predictable on the basis of temperature. Moreover, as our published paper clearly demonstrates, and as Park should have known, the calculated drop in violent crime is extremely robust and not at all sensitive to the assumptions of the statistical model used.
Yet Robert Park abstains from any serious discussion of the data and gives no consideration whatsoever to the appropriateness of the statistical methodology used to analyze them. And despite Park's emphasis on the importance of scientific replication of experimental findings, he neglects to mention that the Washington crime experiment was consistent with 41 previous studies of the effects of Transcendental Meditation on social quality-of-life variables.
In spite of the evidence, Park asserts that levels of violence actually increased to record levels. He confuses homicides, which accounted for only 3% of violent crime in Washington during 1993, with violent crime in general. Even here he omits all data and relies on emotional and inaccurate characterization, asserting that the murder rate "soared" during the experiment and claiming that participants in the project seemed "serenely unaware of the mounting carnage around them."
It is true that the murder rate did not drop during the course, as we acknowledged in the initial research report and in the published study, but the facts were very different from Park's characterization. For the six weeks ending the month before the experiment, from mid-March through April, homicides in Washington averaged ten per week. Beginning one week after the course, and for twelve weeks thereafter, homicides also averaged ten per week. During the eight weeks of the experiment, in June and July, the average was again ten per week, except for one horrific 36-hour period in which ten people died. Apart from this brief episode, which was a statistical outlier, the level of homicides during June and July of 1993 was not significantly higher than during the remainder of the year.
According to his article, Park apparently took his lead on the murder issue from a Washington Post reporter who had been impressed that this one 36-hour period led to a doubling of the murder rate that week. The reporter, and Park, did not notice that the very next week the murder rate swung just as far below its average of ten: the totals went up to 20 one week and down to 4 the next. This is precisely the type of random fluctuation one must account for when total numbers are small. The average incidence of murder in Washington was little more than one per day, and with numbers this low, as Park and all scientists know, random fluctuations can appear extremely large when expressed as percentages.
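How large are "normal" swings when the weekly average is about ten? Treating weekly homicide counts as Poisson-distributed, a standard though simplified assumption for rare-event counts, the relevant probabilities can be computed directly; the mean of ten per week is taken from the figures above.

```python
import math

def poisson_pmf(k, lam):
    """Probability of exactly k events in a period whose mean count is lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 10.0  # assumed average of about ten homicides per week

# Chance of a 20-or-more week (double the average) from randomness alone.
p_high = 1 - sum(poisson_pmf(k, lam) for k in range(20))

# Chance of a 4-or-fewer week (less than half the average).
p_low = sum(poisson_pmf(k, lam) for k in range(5))

# Typical relative week-to-week swing: one standard deviation is sqrt(10)/10.
cv = math.sqrt(lam) / lam

print(f"P(week >= 20) = {p_high:.4f}, P(week <= 4) = {p_low:.4f}, "
      f"typical swing = {cv:.0%}")
```

Under these assumptions, a one-standard-deviation swing is already about 32% of the weekly average, a 4-homicide week is expected a few times a year by chance, and even a 20-homicide week turns up every several years of data without any underlying change in the murder rate.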
Another type of violent crime with low incidence is rape, yet Park makes no mention of it, perhaps because during the two months of the experiment rapes decreased by 58%. If Park were interested in an accurate presentation, he should surely have balanced his statistic-free assertion of a murder wave with this arresting fact. The most comprehensive measure of deliberate violence, of course, includes assaults (the most common form of violent crime, accounting for 92% of the study's outcome variable) along with rapes and murders, which together declined by 23%. Park's brouhaha about the murder rate serves to distract the reader's attention from the main issue: whether a group of people practicing the Transcendental Meditation program achieved a reduction in violent crime.
Park bemoans that "simplistic arguments and homespun humor" are more effective in a debate than citing the laws of thermodynamics (pp. 42-43). But, as Professor Park has discovered, ridicule makes good theatre, if not good science. And popular writing, not scientific writing, sells popular books.
Unfortunately, the precepts of good science that Park exhorts others to follow are sometimes violated in his own treatment of his material, as the example of Hagelin's research shows. The book contains no footnotes that would allow an inquiring reader to check the facts the author presents. Moreover, in places the book is so one-sided in its evaluation of evidence that it falls into the very errors of reasoning it decries. In setting out to write a popular book, the author adopts the language of persuasion to demonize his targets; the book is not written in the even-handed style of scientific writing. Indeed, its presentation is a model of how not to conduct a careful, dispassionate evaluation of scientific issues.
Park's unsupported charges represent a gross attempt to mislead the reader, which is itself highly unethical from a scientific standpoint. This may be surprising given the author's credentials: former chairman of the Department of Physics at the University of Maryland, and director of the Washington office of the American Physical Society. Given the laudable aim of promoting truth in science, one would hope the book would itself be a fine example of the honesty, open-mindedness, and careful evaluation of evidence that the author so strongly advocates.
When scientists fail to evaluate the evidence of scientific studies on its merits, they mislead the public about science. Because Park is a professor at a major university, most lay readers are likely to take his expert opinions at face value. In this regard, his willfully misleading statements are highly irresponsible. Ironically, Park set out to expose deliberate attempts by scientists to mislead non-experts. In attempting to label Hagelin's research as an example of scientific misconduct, his book merely provides a further example of such scientific deception, of an unfortunately common type: misguided attacks on novel scientific theories based on blind disregard of evidence.
Hagelin, J.S., Rainforth, M.V., Orme-Johnson, D.W., Cavanaugh, K.L., Alexander, C.N., Shatkin, S.F., Davies, J.L., Hughes, A.O., and Ross, E. 1999. Effects of group practice of the Transcendental Meditation program on preventing violent crime in Washington, D.C.: Results of the National Demonstration Project, June-July, 1993. Social Indicators Research, 47(2): 153-201.
 The analysis reported in the published paper ruled out the possibility that this could have been due to seasonal effects, because a significant reduction during the summer in violent crime levels compared to the expected levels did not occur during any of the five years prior to the experiment.
 After removing the outlier of June 22, Poisson regression analysis indicated there was no significant difference in the level of homicides in June and July 1993 from the remainder of the year.