How do consumers react after learning that an online bank account has been hacked? Do they take their business elsewhere? Do they limit their online activities to reduce their exposure to such invasions?
Those were some of the questions that intrigued Rahul Telang, a professor of information systems and management at Carnegie Mellon University (CMU) in Pittsburgh, Pennsylvania, who studies the economics of information security. With data breaches an increasingly common problem, he suspected the behavior of hacked consumers could be having a significant impact on global commerce. But Telang didn’t have enough preliminary data to win a grant from the National Science Foundation (NSF) to study the issue; last year the agency funded only 22% of the nearly 50,000 proposals it received.
Fortunately for Telang, NSF offers a funding mechanism that supports the type of exploratory research he wanted to conduct. And this spring Telang received $200,000 to analyze how customers of one major financial institution actually responded to real data breaches. (The firm agreed to share a vast amount of anonymized data with the researcher.)
Telang’s research is being funded by NSF’s EArly-concept Grants for Exploratory Research (EAGER) program, which eschews the agency’s usual reliance on outside peer reviewers and puts the agency’s program staff in the driver’s seat. It’s one of the easiest paths to NSF funding, at least on paper, with success rates topping 90%, according to a recent NSF analysis. Last year, for example, 399 of 441 EAGER proposals were funded. In recent years, the program has had an even higher batting average—95% in 2011 and 91% in 2012.
Despite those overwhelmingly favorable odds, the 5-year-old program doles out only one-fifth of what some senior NSF officials think the foundation should be spending on EAGER grants. At a time when scientists are turning over every rock in search of federal funds, ScienceInsider wondered why the program was so undersubscribed. The answer seems to be the absence of outside peer reviewers—generally considered the gold standard for awarding federal basic research grants. Many NSF program officers seem to be uncomfortable with that alteration to merit review. And so a mechanism designed to encourage unorthodox approaches is languishing because it is seen as going too far.
“Outside your comfort zone”
NSF’s website describes EAGER as a vehicle for “untested, but potentially transformative, research ideas or approaches.” Such “high risk-high payoff” research, it explains, often entails “radically different approaches, applies new expertise, or engages novel disciplinary or interdisciplinary perspectives.”
That’s an apt description of Telang’s response to a “Dear Colleague” letter inviting EAGER proposals under NSF’s Secure and Trustworthy Cyberspace (SaTC) initiative. “It forces you to go outside your comfort zone,” he says about the solicitation. “So I went out of my way to create a partnership [with a CMU colleague, computer scientist Artur Dubrawski]. I had been thinking of the problem, but not specifically as something that would involve others outside my discipline.”
Heng Xu, program director for the SaTC initiative with NSF’s division of social and economic sciences, is glad he did. “It’s the first time a team has ever had access to such a longitudinal database with transactional data as opposed to self-reported behavior or computer simulations,” she explains. “The project is unique.”
“Unique” is an important word for NSF program officers who must weigh an EAGER proposal. The exploratory research the program hopes to fund, they confess, isn’t always easy to define. “We don’t really know what it is, but we can recognize it when we see it,” one veteran staffer says.
Then-Director Arden Bement launched EAGER in 2009 as one of two programs that would rely on in-house reviews. (The second, much smaller, program, called RAPID, allows scientists to respond quickly to research opportunities afforded by disasters or other unexpected events.) He urged NSF program officers to spend up to 5% of their budgets on them. But last year EAGER’s share was 0.9% of all NSF-funded research, or $64 million, and the foundation-wide total for both programs has never topped 1.1%.
Although success rates are uniformly high, the program’s popularity varies widely across NSF’s seven research directorates. At the bottom is NSF’s social, behavioral, and economic sciences (SBE) directorate, which received only 11 EAGER proposals last year—and funded 10 of them. In contrast, the computing and information sciences directorate (CISE) topped the list by reviewing 171 proposals—all but six of which were funded.
Erwin Gianchandani, deputy director for the division of computer and network systems (CNS) within CISE, which teamed up with SBE on the SaTC letter, says EAGER gives NSF a way to test the waters before deciding whether to issue a full solicitation on any particular topic. “We use it to gauge community interest before we launch a formal program,” he explains.
It can also be used to nudge researchers in a particular direction. “We’re a relatively young, rapidly evolving discipline,” Gianchandani says about CNS, “and EAGER lets us try out new research threads.”
“It has to be exceptional”
The EAGER mechanism is not for everybody. With a $300,000 cap and a 2-year duration, the awards are one-third smaller and 1 year shorter than the typical NSF grant and, thus, insufficient for scientists looking to maintain their core research program. They also can’t be renewed. Instead, any follow-up research must be funded through one of NSF’s regular programs.
At the same time, EAGERs offer some advantages. The applications are shorter—some five to eight pages instead of the usual 15 to 20 pages for a standard grant. They also feature a quicker turnaround time, as program officers don’t have to convene an outside panel and wait for its judgment on the value and soundness of the research proposal. Instead, a group of NSF staffers vets the application and moves it up the chain of command.
Outside reviewers would have been superfluous on his EAGER proposal, Telang believes. “I’m not sure they would have had much to contribute,” he says. Unlike with a standard proposal, he says, “this is not a case in which we had already done 25% of the work, and they could evaluate it. I didn’t have the resources to start working with the data set.”
But the absence of outside reviewers may also be the program’s Achilles’ heel. Because NSF’s current merit review system is so highly regarded, program officers are inherently skeptical of anything that takes the broader scientific community out of the equation, notes Jeryl Mumpower, division director for the social and economic sciences.
“Program officers are often faced with making very difficult judgments about a proposal,” he says. “Is it sufficiently exploratory? Is the research high-risk? Is the idea really novel? These are tough decisions, and we have a system that we know works well. So there’s a tendency to be pretty conservative.”
That attitude sets a very high bar for alternative approaches, Mumpower and others say. “Program officers are very reluctant not to involve the community,” he says. “It has to be an exceptional case. Maybe the right answer isn’t 5%. The empirical results suggest that it’s a much smaller number of cases that are appropriate for this special approach.”
Don Rice, a 20-year veteran in the division of ocean chemistry within NSF’s geosciences directorate, says that “peer review is the absolute foundation for determining the best investment of our limited funds.” EAGER awards should be reserved for situations, he says, in which “a PI [principal investigator] has a brilliant new way to tackle a problem, but lacks the resources to collect the preliminary data needed to show that it’s possible.” The relative scarcity of EAGER awards within the directorate—51 proposals, of which 49 were funded—reflects that narrow definition, he says.
Bement, who left NSF in 2010 and returned to Purdue University in West Lafayette, Indiana, where he is now director emeritus of its Global Policy Research Institute, agrees that such views are widespread among NSF staffers. “We promoted it while I was there, but we didn’t get many takers,” he acknowledges. “The purpose of EAGER is to encourage high-risk research at the frontier, and there may be more opportunities to do that in some disciplines than in others. At the same time, frontier research is in the eyes of the beholder.”
Those eyes may actually be more discerning than NSF’s statistics suggest. The 90% success rates don’t include ideas floated by PIs that program officers have nipped in the bud. In fact, there’s much more back-and-forth between researchers and NSF program staff on EAGER proposals than on regular submissions. For starters, scientists thinking about submitting an EAGER proposal are required to talk first to a program officer. Some NSF programs go even further, requiring a brief description of the proposed research and then weeding out those ideas deemed not a good fit.
NSF doesn’t keep any statistics on that give-and-take, however, so it’s impossible to know what proportion of initial EAGER pitches are ultimately successful. That’s also true for the much smaller RAPID program, which reported a nearly perfect 98% success rate last year on 123 proposals. (RAPID awards, which have a $50,000 cap, are even more narrowly defined; they are meant to give scientists a chance to collect data for a suddenly available research opportunity, like the aftermath of the 2010 Deepwater Horizon oil spill in the Gulf of Mexico.)
One step at a time
A breach of financial data may not seem to have much in common with an exploding oil rig. But they do share one feature that makes them appropriate for NSF’s special funding mechanism: Each represented a tabula rasa for curious researchers. And the EAGER award allowed Telang to start the clock.
“Once I received the grant, the first step was to hire a student and have him look at the data,” Telang explains. “Then we’ll ask if we can survey these consumers. And that will depend on the willingness of the financial institution to provide us with access.
“What’s so useful about these data,” he continues, “is that they will tell us how customers reacted after being notified about a breach. Did they close their account, or reduce how often they used it? Or did they decide to stop doing mobile banking?”
Such a data-driven approach is well-suited to the goals of the EAGER program, says NSF’s Xu. “They may be able to use the data to develop a theory in behavioral economics and cybersecurity, which are relatively new fields,” she explains. “That’s why we thought it was appropriate for an EAGER award.”
The award’s shorter duration also gives Telang more leeway to chart the course of future research, an important consideration for an academic scientist. “I’m hoping that it will end up in a nice paper. But until we are done with this first phase, it’s hard to say whether I will want to follow up.”