By Dr. Jay Lehr | May 23rd, 2022 | Science | CFACT (Part One, Part Two)
EPA regulations rely on the environmental epidemiological literature without applying rigorous tests for reproducibility, and without considering the discipline's general refusal to account for the need for Multiple Testing and Multiple Modeling. Such rigorous tests are needed not least because earlier generations of environmental epidemiologists have already identified the low-hanging fruit.
These include the massive statistical correlations between risk factors and health outcomes, such as the connection between smoking and lung cancer. Modern environmental epidemiologists habitually seek out small but statistically significant associations between risk factors and health outcomes. These practices leave their research susceptible to reporting false positives as real results, and to mistaking an improperly controlled covariate for a positive association.
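To see why Multiple Testing and Multiple Modeling matter, consider a toy simulation: if a study examines dozens of outcomes against a single exposure when no real effect exists anywhere, chance alone will usually produce at least one "significant" association. The Python sketch below is a minimal illustration under assumed sample sizes and outcome counts, not a reproduction of any NAS analysis.

```python
# Hypothetical illustration: simulate studies that test many exposure-outcome
# associations when no real effect exists, and count how often at least one
# "significant" result appears by chance alone.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 500   # subjects per simulated study (assumed)
n_outcomes = 60    # outcomes/models examined per study (assumed)
n_studies = 500    # number of simulated studies
alpha = 0.05

studies_with_false_positive = 0
for _ in range(n_studies):
    exposure = rng.normal(size=n_subjects)                  # exposure with no true effect
    outcomes = rng.normal(size=(n_subjects, n_outcomes))    # independent null outcomes
    # One correlation test per outcome: p-value for association with the exposure
    pvals = np.array([stats.pearsonr(exposure, outcomes[:, j])[1]
                      for j in range(n_outcomes)])
    if (pvals < alpha).any():        # naive practice: report any p < 0.05
        studies_with_false_positive += 1

print(f"Studies with at least one 'significant' finding despite no real effect: "
      f"{studies_with_false_positive / n_studies:.0%}")
# A Bonferroni-style correction (testing at alpha / n_outcomes) shrinks this
# rate back toward the nominal 5% family-wise level.
```

With roughly sixty independent tests per study, nearly every simulated study reports a spurious "finding"; that is the inflation the NAS argues must be corrected for.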
Environmental epidemiologists are aware of these difficulties, but have nevertheless turned their discipline into an exercise in applied statistics. They do little to control for bias, p-hacking, and other well-known statistical errors, and the intellectual leaders of the discipline have positively counseled against taking measures to avoid these pitfalls. Yet environmental epidemiologists, and the bureaucrats who depend on their work to support regulations, proceed as a field with unwarranted self-confidence. They have an insufficient sense of how much statistics remains an exercise in measuring uncertainty rather than establishing certainty. Their results do not possess an adequate scientific foundation. Their so-called "facts" are built on Shifting Sands, not on the solid rock of transparent and critically reviewed scientific inquiry.
A NAS study showed how one particular set of statistical techniques, simple counting and p-value plots, can provide a severe test for environmental epidemiology. Meta-analyses must be examined with these techniques to detect p-hacking and other frailties in the underlying scholarly literature. We have used these techniques to demonstrate that meta-analyses associating PM2.5 and other air quality components with mortality, heart attacks, and asthma attacks fail this severe test.
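The p-value plot idea is straightforward in outline: gather the p-values reported by the base studies in a meta-analysis, sort them, and plot them against their rank. If there is no real effect and no selective reporting, the sorted p-values should fall roughly on a straight line from 0 to 1; a sharp bend or a pile-up just below 0.05 is a warning sign. The sketch below uses a made-up list of p-values and illustrates the general idea only; it is not a reproduction of the NAS calculations.

```python
# Hypothetical p-value plot: sorted p-values from the base studies of a
# meta-analysis, plotted against their rank. The p-values below are invented.
import numpy as np
import matplotlib.pyplot as plt

pvalues = np.array([0.001, 0.003, 0.011, 0.024, 0.041, 0.048,
                    0.090, 0.180, 0.310, 0.450, 0.620, 0.810])  # assumed data
pvalues.sort()
rank = np.arange(1, len(pvalues) + 1)

plt.plot(rank, pvalues, "o-", label="observed (sorted)")
# Under a uniform null, the i-th sorted p-value is expected near i / (n + 1).
plt.plot(rank, rank / (len(pvalues) + 1), "--", label="uniform (no real effect)")
plt.axhline(0.05, color="gray", linewidth=0.8)  # conventional significance line
plt.xlabel("rank")
plt.ylabel("p-value")
plt.title("p-value plot (illustrative data)")
plt.legend()
plt.show()
```

A roughly straight 45-degree pattern is consistent with no effect; a hockey-stick shape, with a cluster of very small p-values and the rest scattered, suggests a mixture of results or selective reporting worth scrutinizing.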
The NAS study also demonstrates negligence on the part of both environmental epidemiologists and the EPA. The discipline of environmental epidemiology has failed to adopt a simple statistical procedure to test its own research, and the EPA has failed to require that research justifying regulation be subjected to such a test. These persistent failures undercut confidence in their professional capacities as researchers and as regulators.
Both environmental epidemiology as a discipline (including its journals, foundations, and tenure committees) and the EPA must adopt a range of reforms to improve the reproducibility of their research. However, NAS directs its recommendations to the EPA and, more broadly, to federal regulatory and granting agencies.
NAS has reluctantly concluded that scientists will not change their practices unless the federal government credibly warns them that it will withhold grant dollars until they adopt stringent reproducibility reforms. It has likewise concluded that federal regulators will not adopt stringent new tests of the science underlying regulation unless they are explicitly required to do so.
The National Association of Scholars recommends that the EPA take the following eleven actions to bring its methodologies up to the level of Best Available Science, as mandated by the Information Quality Act.
1- The EPA should adopt resampling methods as part of its standard battery of tests applied to environmental epidemiology research (a brief resampling sketch appears after this list).
2- The EPA should rely for regulation exclusively on meta-analyses that use tests to take account of endemic questionable research procedures, p-hacking and HARKing.
3- The EPA should redo its assessment of base studies more broadly to take account of endemic questionable research procedures, p-hacking and HARKing.
4- The EPA should require preregistration and registered reports of all research that informs regulation.
5- The EPA should also require public access to all research data used to justify regulation.
6- The EPA should consider the more radical reform of funding data set building, and data set analysis separately.
7- The EPA should place greater weight on reproduced research.
8- The EPA should constrain the use of “weight of evidence” to take account of the irreproducibility crisis.
9- The EPA should report the proportion of positive results to negative results in the research it funds.
10- The EPA should not rely on research claims of other organizations until these organizations adopt sound statistical practices.
11- The EPA should increase funding to investigate direct causal biological links between substances and health outcomes.
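Recommendation 1 calls for resampling methods. As one illustration of what such a method looks like in practice, the bootstrap re-estimates a statistic on many resampled copies of the data to gauge how stable it is. This is a minimal sketch: the group sizes, outcome rates, and risk-ratio measure are assumed purely for illustration and are not drawn from any EPA or NAS dataset.

```python
# Hypothetical bootstrap: how stable is an estimated risk ratio under resampling?
# The exposed/unexposed groups below are simulated, made-up illustration data.
import numpy as np

rng = np.random.default_rng(1)

# 0/1 outcomes for an exposed and an unexposed group (assumed data)
exposed = rng.binomial(1, 0.12, size=400)
unexposed = rng.binomial(1, 0.10, size=400)

def risk_ratio(a, b):
    """Ratio of outcome rates in two groups."""
    return a.mean() / b.mean()

point_estimate = risk_ratio(exposed, unexposed)

# Bootstrap: resample each group with replacement and recompute the statistic
boot = np.array([
    risk_ratio(rng.choice(exposed, size=exposed.size, replace=True),
               rng.choice(unexposed, size=unexposed.size, replace=True))
    for _ in range(5000)
])
low, high = np.percentile(boot, [2.5, 97.5])

print(f"risk ratio = {point_estimate:.2f}, 95% bootstrap CI ({low:.2f}, {high:.2f})")
# A wide interval that easily spans 1.0 signals that a small reported
# association may not survive a severe test.
```

The point of such a check is modest: it does not prove or disprove an association, but it shows how much a small estimated effect wobbles when the same data are merely reshuffled.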
NAS has used the phrase "irreproducibility crisis" throughout this essay, though it notes that distinguished meta-researchers prefer to regard the current state of affairs as a challenge rather than a crisis.
You do not need to believe it is a crisis. These current scientific practices are simply not the best available science. We should use the best scientific practices simply because they are the best scientific practices. Mediocrity ought not to be acceptable.
If this is the first article you have read in this series, please go back to the past two weeks at cfact.org to read the even more extensive Parts 1 and 2, or click on my name at the beginning of this article and a list of all my previous article titles will appear. Click on any title to read the full article.
There is no doubt that all CFACT readers question many EPA regulations. After you read this series of articles extracted from the National Association of Scholars booklet, SHIFTING SANDS, you will question even more.
Note: Portions of this essay were excerpted from the book Shifting Sands with permission of the National Association of Scholars (NAS) and its authors Peter Wood, Stanley Young, Warren Kindzierski, and David Randall.