Published originally at Mindwise
“It is possible to commit no mistakes, and still lose. That is not a weakness. That is life,” said then-Captain Jean-Luc Picard. That was exactly thirty years ago. (The episode aired on 10 July 1989.) But it’s worth remembering. It’s also timely: Picard returns to screens this autumn, albeit as a retired admiral. And, because I don’t remember saying it at either Young Talent Grants Week or BSS PhD Day, it bears repeating here.
Briefly, that’s what academia is: doing everything right, getting good reviews, and then receiving a rejection letter anyway. It has happened to me, and—if you choose this life—it will happen to you. It’s part of the game. More important, though, is realizing that you can’t win if you don’t play.
Nobody wins at everything. Although this stings, my advice is to not ruminate on it when it happens. Pick yourself up, dust off your ego, and get back to doing whatever it is that you judge to be worthwhile. The meaningfulness of your work cannot be derived from the number of your acceptance letters.
Rejection as part of the game
Even if none of this sounds familiar, you will probably still have heard about the Rubicon-Veni-Vidi-Vici scheme sponsored by the Nederlandse Organisatie voor Wetenschappelijk Onderzoek (known simply as NWO). This is the primary policy vehicle by which the Dutch government supports academic research, and winning one of their grants can set up your career. So everyone applies. Unfortunately, though, there isn’t enough money to go around.
Rejection rates are sky-high: 95.7% this year, across the two stages of the Veni process (according to the letters I received). And the inter-rater reliability of the entire approach is practically zero. The difficulty of assessing such technical and specialized proposals then leads inevitably to false positives and false negatives: some projects get funded over others more deserving. Indeed, it is common to hear senior academics—including the chair of NWO—describe the system as a kind of lottery. Tenure and promotion criteria reflect this too: rather than actually having to win the grants, emerging and early career scholars need only to have their projects deemed “fundable” (see our intranet for the policy in our faculty).
Still, the stress is unbelievable. I gave up a big chunk of my summer last year to the Veni pre-application, and most of my Christmas break was dedicated to responding to the resulting invitation. The reviews, fortunately, were good. (My pre-application was rated “excellent,” and one of my reviewers was “excited.”) In the time since my application, an EU policy related to my work also changed in a way that supported my proposal. But then I wasn’t invited to an interview. Because the letter informing me of this included the word “unfortunately,” it’s safe to say that the final judgment later this summer will be a rejection.
For me, this isn’t career-ending. I am fortunate to already have a tenure-track position, and I have been successful in applying for other grants. That is perhaps why I was invited to speak about the Veni application process at Young Talent Grants Week: I’ve played the game enough times to have won and lost, so I can smile about it no matter the outcome. And because this summer will again be dominated by these applications for many in this audience, I wanted to share some additional reflections on the process.
In short: these applications are hard on purpose. The word counts are insufficient, the format is constantly changing, the CV requirements are unusual, and the publication expectations are unpredictable. The focus on Knowledge Utilization, rather than Knowledge Translation, is also frustrating. This, though, is all part of the strategy: it’s how granting agencies weed out lower-quality projects before they even start doing their peer reviews.
Think about it: if a large fraction of the applicant pool doesn’t think their proposals will be competitive, then they will conclude that applying isn’t worth the very substantial effort. So there will be fewer applications to review, and therefore a lower rejection rate.
This is diabolical, because it means some inefficiencies are intentional. It’s also very clever, given the institutional goal of funding the right things: How do you solve a choice-problem where the decision-maker knows less, at the level of each project, than the individual experts who are encouraged to frame their applications in the best possible light?
Journal editors face the same problem. Everyone wants to present their research as ground-breaking, and therefore eminently publishable (and fundable). But that is rarely the case. Everything can’t be revolutionary, or the system of scientific knowledge production would collapse. Science needs stability, as well as change. So editors, reviewers, and funders struggle: with very limited resources to invest, and many different kinds of research to invest in, how can the overall value of the portfolio be maximized in an efficient way?
One tool that’s often used has nothing to do with the research itself. Instead, the idea is to optimize the stress of the submission process. This is explained in a new preprint authored by a team led by Dr. Leonid Tiokhin of the Human Technology Interaction Group at Eindhoven University of Technology:
Submission costs de-incentivize scientists from submitting low-quality papers to high-impact journals via two mechanisms: differential benefits (e.g., low-quality papers are less likely to be published than high-quality papers) and differential costs (e.g., low-quality papers have higher submission costs than high-quality papers). Costs to resubmitting rejected papers also promote honesty. Without any submission costs, scientists benefit from submitting all papers to high-impact journals, unless papers can only be submitted a limited number of times.
In other words, the inconvenience of submitting is planned. The stress and frustration, strategic. Higher application costs in time and energy mean fewer applicants, which in turn means fewer false negatives, fewer false positives, and a more efficient distribution of funds to those whose projects both need and merit the investment. (The limited number of submissions per applicant has also been adopted by NWO, and the European Research Council uses it too.)
You would be forgiven for thinking that they don’t want your application. They don’t. And they make sure you feel that they don’t. Indeed, that’s the strategy described by Tiokhin and colleagues.
The trouble with this, for would-be applicants, is that just getting the reviews is a valuable part of the application process. Indeed, the reviews are so valuable that—if you’re ready to benefit from the feedback, and if the reviewers are honest and conscientious in providing it—the inconvenience of explaining your work their way, using their templates, is often worthwhile. My advice regarding applying for grants is therefore the same as it is for writing-for-publication: aim for revise-and-resubmit, expect an unenthusiastic response, and then use the resulting expert commentary to make your next draft stronger. Even a complete misunderstanding shows you where your explanation needs to be clarified.
Allow me, on this basis, to reframe rejection: if you aren’t having your work rejected, it’s not because you’re a genius. It’s because you aren’t aiming high enough. So aim high, expect to miss, learn from the feedback, and then try again. To put it another way: play the long game. That’s how you turn rejection into acceptance; more importantly, it’s how to win at life.
J. T. Burman
JEREMY TREVELYAN BURMAN, PhD, is tenured Senior Assistant Professor (UD1 with indefinite contract) of Theory and History of Psychology at the University of Groningen in the Netherlands. The primary focus of his research is Jean Piaget, but he is also interested more generally in the formalization and movement of scientific meaning—over time, across disciplines, between languages, and internationally. To pursue these interests, he uses methods borrowed from the history and philosophy of science (esp. archival study) and the digital humanities (esp. network analysis).
Selected recent major works
Burman, J. T. (in press). The genetic epistemology of Jean Piaget. In W. Pickren (Ed.), The Oxford Research Encyclopedia of the History of Psychology. Oxford University Press.
Burman, J. T. (2020). On Kuhn’s case, and Piaget’s: A critical two-sited hauntology (or, on impact without reference). History of the Human Sciences, 33(3-4), 129-159. doi:10.1177/0952695120911576
Burman, J. T. (2019). Development. In R. J. Sternberg & W. Pickren (Eds.), The Cambridge Handbook of the Intellectual History of Psychology (pp. 287-317). New York: Cambridge University Press.