For the 40 years that I have been involved in radiation biology, I have been told that the discipline, indeed almost the entire spectrum of radiation sciences, is disappearing and that radiation researchers themselves are a “dying breed”. On a personal level, the evolution over that time has been a transition from “they say” to “I say”, as the ever-worsening crisis has become evident from both formal (1–3) and informal studies.2 Indeed, positioning the Radiation Research Society to better face this situation was a key platform during my time as President. I am obviously not alone in this concern, with many making stalwart and repeated efforts to identify the underlying causes in the hope of mitigating the decline. In general, their conclusions have included both radiation-specific factors [e.g., loss of specialized training programs (4, 5); significant reductions in career opportunities in radiation biology and related disciplines (5); competition from and/or a lack of appreciation by other sciences/scientists (4, 5), etc.] and influences that currently apply across a plethora of disciplines [e.g., the overall reduction in funding opportunities (6, 7); a reduction in public respect for scientists and science, etc.]. However, little has seemed to stem the tide so, at this time, I beg your collective indulgence to offer a reflection on this topic, with a focus on what I believe is a previously unacknowledged issue. My observation arises from participation on multiple review panels, in combination with my specific experience over the past two years when, through three iterations, I have been (unsuccessfully) submitting a P01 proposal.
Despite the limited number of criteria needed for a successful grant application (8), the overall low level of success suffered by so many of us led me to contemplate the role that our members may play on peer review panels, with the realization that we, as radiation researchers, may be living in a “crab bucket”.
So, what is a crab bucket? I first became aware of this concept while reading one of my favorite authors, Terry Pratchett (9), although a friend remarked recently that the idea may have originated in India with the penchant of the British Raj for seafood. Whatever its origins, the essence of the crab bucket metaphor is this: you can carry a large number of crabs in an open bucket without fear of any escaping. This is because as fast as any one crab attempts to climb out, the others drag it back down; the group as a whole prevents the progress of any single individual. Although certainly not a phenomenon peculiar to radiation biology alone, personal observation indicates that our discipline is particularly prone to this mentality when faced with scoring a peer's work. As a means of comparison, while participating in reviews of other small scientific niches, I have seen “blind” support of a peer's research: scores of 1s across the board, irrespective of the quality of the research, with the subject expert taking on the role of champion and brooking no dissent from the rest of the panel. Such support may be considered psychologically as a “rising tide” mentality, made in anticipation that if a peer in the same subspecialty is successful, more money will follow. Of course, alternatively, this might reflect a subconscious hope for a quid pro quo (10) or offer evidence of cronyism (11) since, in many of the smaller research areas, it is relatively easy to identify one's peers on a review panel. However, in contrast, when considering more recent radiation biology reviews, I have seen the reverse occur, with the “expert” initiating or supporting a veritable bloodbath. If this perception is true, the question arises as to why some might feel the need to treat a peer's work so harshly.
One explanation is that, as the number of participants in our field has declined, there has been a parallel fall in the number of available experts to perform reviews. Although this could have resulted in the cronyism described above, when available NIH funding is also on the decline (12, 13), the combination has likely produced an excessive sense of competition (14), so that the grant evaluation process becomes an extension of the perceived hierarchical relationship between the assigned reviewer and the applicant (14, 15). Interestingly, while discussing this topic with others, a source at the NIH3 commented that, when designated the “subject matter expert”, radiation biologists seem inclined to demonstrate their expertise to the rest of the panel by providing excessive criticism rather than championing the application. A number of studies have shown that such actions have a detrimental effect on funding potential, especially when a review panel includes a high number of non-experts or when there is disagreement regarding the quality of the submission (16); on a personal note, being older (16) and/or female (17) further, and significantly, increases the applicant's funding disadvantage.
Of course, some readers may dismiss these observations as sour grapes, since they provide an excuse for my failure by blaming others. However, I firmly believe that there is something rotten in the state of radiation biology4 and, at a time when the workforce of radiation research professionals continues to decline and funding resources continue to shrink, the often onerous role of reviewer becomes of paramount importance. Therefore, I would ask that all who accept this role consider the possible existence of subtle, subconscious biases that may affect your view of your fellow scientists' work. As a counterbalance to these potential biases, I offer some additional considerations that might be added to your personal review criteria:
When acting as the subject expert, focus on feasibility and the appropriateness of the methodologies and techniques; importantly, set aside personal disagreements with the proposed hypotheses, the use of alternative models (unless obviously inappropriate), etc. This is especially important if you are a senior researcher. Try to take a broader view; indeed, disagreements with radiation dogma should be supported if soundly argued. How else can we move our field forward?
If you are relatively junior when asked to be a reviewer, assess your assignments as you would wish yours to be assessed. There is no need to demonstrate your expertise – to be on the panel, you must already be considered an expert;
While remaining cognizant of the amount of work involved, do not simply focus on weaknesses; make every effort to provide feedback on how to improve the grant. Importantly, strongly encourage the inclusion of a qualified radiation worker on any investigative team; not only will the grant benefit, but the importance of a knowledgeable radiation scientist will be emphasized and may, in one small way, ensure the continuation of our profession.
I am certainly not advocating that an application's merit be artificially inflated, but that adequate weight be given to the positive aspects of a proposed investigation. Above all, be kind to your fellow radiation researchers; after all, it only takes one crab to prevent an entire bucketful from escaping.
ACKNOWLEDGEMENTS
I would like to thank those of you who took the time to read this commentary and provide me with feedback. In deference to your collective positions in the field and the Radiation Research Society, I will allow you all to remain anonymous.
REFERENCES
Notes
2 The author is Co-Chair of the National Council on Radiation Protection and Measurements (NCRP) Council Committee 2 on Meeting the Needs of the Nation for Radiation Protection.