Scientists always want access to more data. But what if offering less were the way to achieve that goal?
The National Science Foundation (NSF) is pondering that seemingly paradoxical approach as part of rethinking Science and Engineering Indicators—the agency’s massive biennial statistical bible covering everything from spending on research and education to regional development, trade, and public attitudes toward science.
First issued in 1972, Indicators is a product of a mandate from Congress to compile “a report on indicators of the state of science and engineering in the United States.” The tome has become an important source book for policymakers who set the nation’s scientific priorities and for a community of researchers, educators, lobbyists, journalists, and others who use the data.
But those who produce and use Indicators have begun to wonder whether the current version may be more than Congress needs—or can possibly digest. For evidence, they point to the fact that what began as a slim, 145-page report has ballooned into a 1500-page behemoth (see graphic).
Last week NSF sponsored a 2-day workshop, attended by some 70 people familiar with Indicators, on ways to improve its scope, content, accessibility, and timeliness. Meeting around the corner from NSF headquarters in Arlington, Virginia, workshop participants acknowledged that they have helped feed the beast by asking NSF to include an ever-expanding universe of metrics. And although NSF officials didn’t ask for a consensus on any changes, they did get plenty of suggestions.
One idea was that everyone could gain if the report goes on a diet. Several participants argued that a svelter Indicators would let the staff of NSF’s National Center for Science and Engineering Statistics (NCSES), which compiles the report, spend more time being a clearinghouse and service center for the research community. As one participant proclaimed, “Let NCSES be NCSES.”
A slimmer Indicators would also fit better with how people interact with data, explained Vinton Cerf, chief Internet evangelist at Google and a member of the National Science Board, NSF’s oversight body and the official publisher of Indicators. “We can’t anticipate everybody’s needs,” Cerf said. “So we need a platform to give people better access to the underlying data, as well as ways to combine it with data they already have, to run their own analyses.”
Cerf suggested that NSF fund a small research program—“maybe $5 million a year, or even less,” he guessed—aimed at designing that platform. Andrew Reamer, a research professor at the Institute of Public Policy at The George Washington University in Washington, D.C., thought NSF’s ultimate goal should be “to help others produce their own Indicators.”
The 2016 edition that debuted in January is digital-only, a change some view as a first step down that path. But it was clear from the workshop that plenty of obstacles remain.
Some researchers thought that freeing NCSES from the tyranny of a print production schedule should improve the report’s timeliness—some of the time series end 3 to 4 years before the date of publication—and possibly even lead to continual updates. But as NCSES staff noted, in many cases newer data simply don’t exist. Some workshop participants also recoiled at the thought of a rolling report. “Indicators is supposed to be a snapshot of the world at a particular point in time,” said Chris Hill, professor emeritus of public policy at George Mason University in Fairfax, Virginia.
Despite its heft, Indicators remains far from comprehensive. Workshop participants had no trouble listing topics that the current report ignores, including the important contributions of philanthropy and venture capital in fueling scientific discovery and innovation and the impact of educational technology on student learning. The report is also silent on factors that impede scientific progress, such as the growing burden on universities to comply with federal regulations governing the conduct of research.
And then there are nonpublic sources of data that could shed light on the scientific enterprise but remain off-limits to researchers because of privacy or proprietary concerns. “Even if you query NSF and get an answer, you don’t really know what else is out there,” one participant noted in summing up the problem.
Of course, the world isn’t standing still as NSF tinkers with Indicators. Participants noted that tight budgets may already be forcing some federal agencies to cut back on surveys and other data collection efforts that feed the report. And if NSF truly wants to reduce the size of the report, what will be scrapped? “How do we say no, and to whom?” asked one worried NCSES staffer, hinting at the possible political repercussions of excluding something that legislators deem important.
Fortunately for NSF, the report’s solid reputation and the absence of any outside pressure give it the luxury of proceeding with any changes at its own pace. But change is coming, hints Kelvin Droegemeier, retiring vice chair of the board and head of its Indicators committee. “The board is very interested in seeing if the congressional intent can be met while freeing up resources to strengthen Indicators,” he told participants before they headed out the door. “What we do next will be informed by what happened here.”