Year: 2016 | Volume: 9 | Issue: 3 | Page: 291-293
Modern research methodology, instant cricket, and weighing the pig
Department of Community Medicine, Dr. D. Y. Patil Medical College, Hospital and Research Centre, Dr. D. Y. Patil Vidyapeeth, Pune, Maharashtra, India
Date of Web Publication: 17-May-2016
Source of Support: None, Conflict of Interest: None
How to cite this article:
Banerjee A. Modern research methodology, instant cricket, and weighing the pig. Med J DY Patil Univ 2016;9:291-3
Modern research methodology has contributed much to the rational practice of medicine. Much of it rests on the application of biostatistics, particularly probability theory, in finding the "best evidence" for a cause-effect relationship. Over the past decades, this dependence on biostatistics and probability theory has deepened into overdependence and the evolution of a new paradigm in medical decision making: evidence based medicine (EBM). However, as with all good things, overuse brings the law of diminishing returns, and EBM is no exception.
EBM is a double-edged sword. The wrong hands use one edge while the right hands use the other, each to suit their respective goals. EBM provides a gold standard for judging the efficacy of new interventions and, based on a hierarchy of study designs, for establishing cause-effect relationships. On the downside, it has disadvantages. One is the snobbish exclusion, by some proponents of EBM, of all nonstatistical inputs to medical decision making. In the long run, this trend will impoverish medical science, which is not an exact science but depends heavily on judgment, intuition, and the accumulated experience of experts. These finer nuances of the art of medicine are being lost under the increasing influence of EBM.
In the clutter of statistical outputs, what is overlooked is that probability theory, the foundation of EBM, has inherent limitations as the sole measure of uncertainty. The modern application of statistical theory to decision making in the face of uncertainty has involved oversimplification in our endeavor to fit statistical models to our specific needs. This simplification has a limit. The maxim "make everything as simple as possible, but not simpler," attributed to Einstein, cautions against oversimplification. The fallacy of adopting oversimplified statistical models to guide medical decision making and policy has led to many paradoxical situations. The narrow and mechanical application of borrowed statistical techniques, in a bid to attain absolute certainty, has adversely affected creativity and inspiration. Medical journals are full of stand-alone scientific papers with "statistically significant" results that are rarely cumulative and contribute little to the understanding of the bigger puzzle of disease processes. Often, these studies cannot be replicated, and the research claim is more likely to be false than true.
Causation, as assumed by EBM, is rarely straightforward and amenable to statistical models. To accommodate causation in these models, its complexities are often glossed over or sometimes ignored outright. We are an impatient generation, and this is reflected in our sports as well as our science. Take the popular sport of cricket. Earlier generations cherished test matches even though quite a few of them ended in a draw. Subsequent generations preferred one-day matches and other forms of instant cricket, which rarely end in a draw. The most popular form of cricket today is the Indian Premier League (IPL), which, besides yielding results in real time, generates billions of dollars in revenue.
Similarly, for present-day researchers, uncertainty in research results, akin to a draw in sports, is unacceptable. The goal is to eliminate uncertainty at all costs, and the tool used to do so is statistical significance. Achieving statistical significance at the traditional 0.05 level has become a weighing scale for eliminating uncertainty. A statistically significant result is assumed to mean that the hypothesis being tested is true. The issue is deemed settled rather than letting ambiguity linger and exploring the research question further. As in the IPL, the outcome is forced in quick time. This practice overlooks the complexities that lead to biases in establishing cause-effect relationships in human research. The complexities arise from the difficulty of obtaining a truly representative sample, inherent biological variability, difficult-to-measure variables, and confounders. All these complexities are willfully ignored in the endeavor to fit statistical models that eliminate uncertainty.
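The point that statistical significance does not settle truth can be made concrete with a back-of-the-envelope calculation in the spirit of Ioannidis: the probability that a "significant" finding is actually true depends on the prior plausibility of the hypothesis and the power of the study, not on the 0.05 threshold alone. The numbers below are illustrative assumptions, not figures from this editorial.

```python
# Illustrative sketch (all numbers are assumptions): even at p < 0.05,
# the chance that a "significant" finding reflects a true effect depends
# on how plausible the tested hypotheses were to begin with.

def positive_predictive_value(prior, power, alpha):
    """Probability that a statistically significant result is a true effect."""
    true_positives = prior * power          # true hypotheses correctly detected
    false_positives = (1 - prior) * alpha   # false hypotheses passing the test
    return true_positives / (true_positives + false_positives)

# Suppose only 1 in 10 tested hypotheses is actually true, studies have
# 80% power, and the significance threshold is the conventional 0.05.
ppv = positive_predictive_value(prior=0.10, power=0.80, alpha=0.05)
print(f"PPV = {ppv:.2f}")  # 0.64: roughly 1 in 3 "significant" findings is false
```

Under these assumed inputs, "statistically significant" is right only about two times in three; only when the prior plausibility of hypotheses is high does significance approach proof.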
Three types of biases creep into research studies. These are selection bias, measurement bias, and confounding bias.
Volunteers in clinical trials, even in the highest-rated randomized controlled trials (RCTs), are a select lot. These trials are usually undertaken at tertiary care centers. Stringent ethical regulations, including explicit informed consent, ipso facto limit these trials to those who volunteer, who may differ from those who decline to participate. In such a scenario, selection bias cannot be fully ruled out. This can limit the external validity, or generalizability, of the results of an RCT.
Biological variables can be measured with varying degrees of accuracy. Some are easy to measure, while others are difficult and elusive to quantify. Variables such as height, weight, blood pressure, and hemoglobin levels can be measured easily, with negligible error for research purposes. On the other hand, variables such as attitudes, behavior, and mental states are difficult to measure with reasonable consistency and accuracy. To overcome these problems, rating scales may be used; developing a scale for a difficult-to-measure variable takes considerable effort, and the scale must subsequently be validated in different settings. Alternatively, surrogate measures may be used, with limitations of their own. All of these lead to varying degrees of measurement error. "Case definition," which depends on proper measurement and classification, can be difficult in such situations.
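Measurement error does more than add noise; it systematically weakens the associations a study observes. A minimal simulation, with wholly synthetic and illustrative data, shows this attenuation:

```python
import random

# Hedged sketch (synthetic data, illustrative only): random measurement
# error in an exposure attenuates its observed correlation with an outcome,
# so a noisy scale systematically understates the true association.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
true_x = [random.gauss(0, 1) for _ in range(5000)]       # exposure, measured perfectly
outcome = [x + random.gauss(0, 1) for x in true_x]       # outcome truly driven by exposure
noisy_x = [x + random.gauss(0, 1.5) for x in true_x]     # same exposure via an error-prone scale

print(f"true exposure:  r = {pearson(true_x, outcome):.2f}")   # around 0.71 by construction
print(f"noisy exposure: r = {pearson(noisy_x, outcome):.2f}")  # markedly weaker
```

The underlying relationship is identical in both cases; only the measuring instrument differs, yet the observed correlation drops substantially.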
Even the widely used pain scores have been found to yield misleading metrics. Pain score metrics have failed to distinguish between the severity of pain and the response to painkillers in the relief of acute and chronic pain.
The most difficult bias to eliminate in research studies is confounding bias. Confounders are ubiquitous: some known, most unknown. A significant correlation between two variables can be due to the presence of a third, uncontrolled or unknown, variable. Such confounding can distort the interpretation of study results.
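A brief simulation, again with synthetic and purely illustrative data, shows how a hidden third variable can manufacture a strong correlation between an "exposure" and an "outcome" that do not influence each other at all:

```python
import random

# Minimal sketch (synthetic data, illustrative only): two variables that
# share a hidden confounder appear strongly correlated even though neither
# causes the other.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
confounder = [random.gauss(0, 1) for _ in range(5000)]   # e.g. an unmeasured trait
exposure = [c + random.gauss(0, 1) for c in confounder]  # driven only by the confounder
outcome = [c + random.gauss(0, 1) for c in confounder]   # also driven only by the confounder

r = pearson(exposure, outcome)  # close to 0.5 by construction, with no causal link
print(f"r = {r:.2f}")
```

A naive analysis would report this as a highly significant association; only measuring and adjusting for the confounder would reveal that the direct effect is nil.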
The simplistic approach of EBM, preoccupied with statistical inference while ignoring vital issues such as selection bias, measurement errors due to instrument or observer bias, case definition, and confounders, is illustrated by the example of "Weighing the Pig" in [Figure 1]. John Burns, as quoted by Abramson and Abramson, described a method of "Weighing the Pig" as follows: Find a plank that is absolutely straight, balance it at dead centre so that it is absolutely level, place a pig on one end, and pile stones on the other end until the plank is exactly level again. Then carefully guess the weight of the stones. This gives the weight of the pig!! Abramson and Abramson used this analogy to highlight the inaccuracies inherent in the methods of most cost-benefit analysis studies, since the collection of adequate data and the conversion of benefits into monetary units present considerable theoretical and practical difficulty. The same applies to most studies based on the methods of EBM.
[Figure 1] illustrates the present-day emphasis on biostatistics: the plank, absolutely straight and balanced at dead centre so that it is absolutely level, represents the most perfect statistical techniques; the stones on the left represent the vital issues, such as selection bias, case definition, measurement errors, and confounders, that are glossed over in present-day EBM; and the pig on the right represents the hypothesis whose truth is being inferred.
EBM as practiced today has led to skepticism about its usefulness, and views have been expressed about its real-world failures.
An excellent "counter-culture" to EBM, restoring emphasis on the art of medicine, has been put forth by Mukerjee. The three laws he proposes for the uncertain science of medicine are as follows: Law one, a strong intuition is much more powerful than a weak test; law two, "normals" teach us rules while "outliers" teach us laws; and law three, for every perfect medical experiment, there is a perfect human bias. These laws should caution us against relying solely on the methods of EBM.
Clinicians in their day-to-day practice face much ambiguity. They have little understanding of statistical techniques and must rely heavily on intuition. Researchers, on the other hand, regard their collected data and statistical models as objective measures of reality. Both approaches have limitations. The way forward is a more integrative reconciliation of the two approaches to uncertainty in medicine. Probability is a fine balancing act (weighing the pig!!). We should not overlook the infinite complexity of the individual case; on the other hand, we must resort to some simplification, ignoring many individual differences, to achieve useful generalization. Perhaps we must encourage the coexistence of both philosophies to bring science to the art of medicine.
The author acknowledges Professor J. H. Abramson, Emeritus Professor of Social Medicine, The Hebrew University-Hadassah School of Public Health and Community Medicine, Jerusalem from whose book the context of "Weighing the Pig" has been borrowed. Professor Abramson was also kind enough to go through this editorial and approve it before its publication.
References
Isaacs D, Fitzgerald D. Seven alternatives to evidence based medicine. BMJ 1999;319:1618.
Weisberg HI. Willful Ignorance - The Mismeasure of Uncertainty. New Jersey: John Wiley & Sons, Inc.; 2014.
Ioannidis JP. Why most published research findings are false. PLoS Med 2005;2:e124.
Ballantyne JC, Sullivan MD. Intensity of chronic pain - The wrong metric? N Engl J Med 2015;373:2098-9.
Abramson JH, Abramson ZH. Research Methods in Community Medicine. 6th ed. West Sussex: John Wiley & Sons, Ltd.; 2008. p. 59.
Miller CG, Miller DW. The real world failure of evidence-based medicine. Int J Pers Cent Med 2011;1:295-300.
Mukerjee S. The Laws of Medicine - Field Notes from an Uncertain Science. London: Simon & Schuster UK Ltd.; 2015.