As most of you are aware, the Department of Energy has completed its first year of Comparative Review for university grants, which affected approximately one-third of US HEP university groups. The remaining groups will go through the new process over the next two years. The move to a comparative review process was recommended by several advisory committees. The new process uses panel reviews and is a substantial departure from prior DOE HEP review procedures. The first round resulted in substantial changes in funding for several university groups, including new support for a number of junior faculty members and a loss of support for a number of university PIs and senior research scientists.
The comparative review has led to considerable discussion within the HEP community. Since this new review process is likely to have substantial impacts on most US university groups, the DPF Executive Committee has appointed a committee to collect information about the process from both DOE and the community, to report our findings to the community, and perhaps to make recommendations to DOE with the goal of helping to improve the comparative review process. We submitted the following questions to DOE.
With this web page we are soliciting input from the HEP community. Public comments can be posted on this page; private comments, which the committee will keep confidential, can also be submitted. Individuals are also welcome to send personal emails or to speak in person to any of the committee members listed below.
We expect to write a short report which will be sent to DOE and distributed to the DPF membership by the end of the summer.
We look forward to hearing from you.
Sincerely,
Marj Corcoran and John Cumalat, co-Chairs
for the DPF Committee on DOE Comparative Reviews
Committee Members:
Marj Corcoran and John Cumalat, co-chairs
Chip Brock
Michael Dine
Paul Grannis
Klaus Honscheid
Jack Ritchie
Kate Scholberg
Stew Smith
Rick van Kooten
Mike Witherell
My biggest concern with the process was that any group with an integrated program spanning more than one of the frontiers is at a big disadvantage. The program is broken up by frontier, and while the panel and readers may be able to look at other parts of a proposal, they typically will not have the time. So a group that leverages expertise in one area (say, the energy frontier) to do work in another area (say, the intensity frontier), providing a “whole that is greater than the sum of its parts,” will not be reviewed in that way. I think this is very unfortunate; such cross-pollination of ideas, even within a group, can make very valuable contributions.
Breaking up the grants also makes them harder to manage. The ability to pool resources and to shift support from one project to another was a definite plus. By creating artificial barriers, OHEP has made it that much harder to deal with steadily decreasing funding.
The requirement that proposals coincide with "programmatic priority judgments" may be a serious flaw. These judgments may result in over-subscription to some projects that have been clearly identified as priorities, while other fine physics projects are neglected. A prime example is the substantial US expertise in heavy flavor physics, which is unable to sustain a long-standing collaboration with European groups. Another example is the substantial physics potential that will be lost by locating the LBNE far detector on the surface. Are the accomplished advocates of these physics goals expected to leave the program? Yes, I am well aware that there are real financial constraints, but these have always existed. In the past, programmatic decisions were more directly in the hands of physicists, and we were able to sustain a broad program at the forefront of all areas of high energy physics.
A pitfall of comparative ranking arises when most proposals are of high quality: the rankings become essentially random, based on small differences in perceived quality. Proposal evaluation should be based, at least in part, on an “absolute” scale. To deal with the unfortunate situation of continually decreasing funding for HEP, it may be necessary to make programmatic choices, but those choices should take into account broader criteria that would strengthen US science. Such criteria could include diversity among people and institutions.
After the first year of comparative reviews, there appeared to be a policy decision against supporting university-based research scientists, despite widespread recognition within the university community that such positions are important to the success of OHEP's research mission. If this is indeed a policy decision, it would be better to state it explicitly, along with a plan for an alternative way to provide the needed full-time, experienced, stable scientific leadership.
There has always been a need for a more transparent funding process in OHEP. Basic information about each grant should be publicly and conveniently available, as is the case for NSF programs.