Jun 20 2013
Data Analytics

Government and Higher Education Team Up to Tackle Big Data

Big Data creates opportunities to share critical information and leverage the expertise of those who can make sense of it all.

Since the late 1990s, experts at the Applied Physics Laboratory (APL), a division of Johns Hopkins University, have monitored diseases throughout the world. And since 2012, researchers at APL have been attempting to predict outbreaks before they begin.



The program, known as PRedicting Infectious Disease Scalable Method (PRISM), crunches about 220 gigabytes of data to reach its conclusions, says project manager and co-developer Anna L. Buczak. Sponsored by the U.S. Department of Defense’s Joint Project Manager Transformational Medical Technologies, the program is currently monitoring the development of dengue fever outbreaks in Peru and the Philippines, as well as outbreaks of malaria in South Korea.

The information can help government and public health officials make the best use of scant resources, according to Sheri Lewis, manager of the APL’s Global Disease Surveillance program, a university-affiliated research initiative. “They could then target their mitigation so they could spray in a particular geographic area or do enhanced public information campaigns,” Lewis says.


A tremendous amount of data is now available for collection and analysis, with new potential for public benefit. In many cases, the federal government relies on institutions of higher education to collect, analyze and present the critical data that agencies use to perform their missions.

While the federal government has access to high-powered computers capable of processing the burgeoning amounts of data created each day, it is often higher education institutions that provide the expertise, flexibility and nuance needed to truly make sense of it all. In the case of disease prediction, Lewis says, the APL uses computers that can hold large amounts of information, but nothing close to supercomputer scale. And the data analyzed, such as rainfall totals, temperature, socioeconomic information and facts about local sanitation systems, is for the most part publicly available. What the APL provides is professional expertise and context, Lewis says.
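To make the idea concrete, here is a minimal sketch of the kind of feature-based risk scoring a prediction effort like PRISM might perform on those inputs. The function name, weights and normalization bounds are all illustrative assumptions, not APL's actual method, which would learn its parameters from historical outbreak data.

```python
# Hypothetical sketch of outbreak-risk scoring from environmental and
# socioeconomic inputs. All weights and bounds are invented for illustration.

def outbreak_risk_score(rainfall_mm, temp_c, sanitation_index):
    """Combine three inputs into a rough 0-1 outbreak risk score.

    sanitation_index: 0 (poor) to 1 (good). Weights are assumptions,
    not values from the PRISM program.
    """
    # Normalize each input to a rough 0-1 range (assumed plausible bounds).
    rain = min(rainfall_mm / 300.0, 1.0)                # standing water aids mosquito breeding
    warmth = min(max((temp_c - 15) / 20.0, 0.0), 1.0)   # vectors favor warm climates
    poor_sanitation = 1.0 - sanitation_index            # lower index, higher risk

    return round(0.4 * rain + 0.35 * warmth + 0.25 * poor_sanitation, 3)

# Example: a warm, rainy region with middling sanitation
print(outbreak_risk_score(rainfall_mm=250, temp_c=30, sanitation_index=0.4))
```

A real system would replace these hand-set weights with a model fit to past outbreaks, but the shape of the computation, ordinary inputs turned into a targeted forecast, is the point Lewis makes about expertise and context.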

Keeping the Public Informed

The National Drought Mitigation Center at the University of Nebraska–Lincoln has produced the weekly U.S. Drought Monitor since 1999, in collaboration with several federal, state and local agencies. Released each Thursday at 8:30 a.m. EST, the Drought Monitor compiles a week's worth of information to produce a map used by the National Weather Service, the U.S. Department of Agriculture, the Weather Channel and others to share the latest updates on conditions. The online version receives more than 3 million page views per year, says Mark Svoboda, a climatologist and University of Nebraska–Lincoln faculty member. Svoboda developed the monitor in collaboration with Doug LeComte, a scientist with the Climate Prediction Center of the National Oceanic and Atmospheric Administration, and with the U.S. Department of Agriculture.

In the program’s early years, the resulting map was developed from just a few types of data, including stream flow, temperature, vegetation and precipitation, measured at only a few hundred locations. Today, data and measurements are gathered from more than 6,000 locations throughout the country and then used to create a color-coded, information-dense map, largely with the help of Geographic Information System (GIS) software. Svoboda says a server farm with more than 50 terabytes of disk space is used to perform the work.
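The color coding on that map reflects the Drought Monitor's published D0 through D4 severity categories, which are tied in part to percentile rankings of indicators like precipitation. A simplified sketch of that classification step, using the rough percentile cutoffs the Monitor publishes, might look like this; the real map blends many indicators plus the expert judgment McNutt describes below.

```python
# Simplified sketch: mapping one indicator's percentile ranking to the
# U.S. Drought Monitor's D0-D4 categories. Cutoffs follow the Monitor's
# published percentile guidance; the actual product combines many
# indicators and expert review.

def drought_category(percentile):
    """Return a Drought Monitor category for a station's percentile ranking.

    A low percentile means conditions are drier than most years on record.
    """
    thresholds = [
        (2, "D4 exceptional drought"),
        (5, "D3 extreme drought"),
        (10, "D2 severe drought"),
        (20, "D1 moderate drought"),
        (30, "D0 abnormally dry"),
    ]
    for cutoff, label in thresholds:
        if percentile <= cutoff:
            return label
    return "none"

# A station drier than 92 percent of years on record
print(drought_category(8))
```

Classifying one station is trivial; the hard part, as the scientists note, is reconciling 6,000-plus stations and multiple indicators into a single coherent picture each week.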

Chad McNutt, deputy program manager for the National Integrated Drought Information System, says it’s likely that the program has enjoyed greater flexibility than it might have, had it been run by the federal government alone.


The expertise of Svoboda and other scientists makes the map truly valuable, McNutt says, adding that the weekly Drought Monitor represents more than the sum of its data.

Drought is “an insidious thing” that’s not always obvious, McNutt says. To fully understand its implications, “you need an expert assessment. If you want to be able to reflect the conditions, you need some human intervention.”

And that’s just what Nebraska’s National Drought Mitigation Center provides.

“It’s a classic example of pooling resources, of sharing data and working together to provide a tool that was basically made to enhance the visibility of drought,” Svoboda says. “We serve as an unbiased, outside party.”

