NSF awards $4.5 million grant to build platform for geospatial data management
A team led by ITaP Research Computing Senior Research Scientist Carol Song has been awarded a five-year, $4.5 million grant from the National Science Foundation to build a “plug and play” platform to allow researchers to easily access and process geospatial data.
Song describes GeoEDF as a successor to the Geospatial Data Analysis Building Blocks (GABBs), a project she led that developed web-based geospatial data visualization, analysis and modeling tools and made them accessible to users on the science gateway MyGeoHub. GABBs is open source and available to anyone, regardless of Purdue affiliation; a geospatial gateway powered by GABBs software can be set up on cloud computing platforms such as Amazon Web Services.
Despite the advent of geospatial data processing tools accessible even to non-programmers, data challenges remain in this area. Many geospatial data repositories lack standard interfaces and don’t provide data in a way that researchers can immediately use. Moreover, as field sensors become increasingly common, large volumes of streaming data are created, including so-called “crowdsourced” data generated by citizen scientists. GeoEDF’s data processing pipeline will help researchers retrieve and process only the data they need, and transform it into standardized formats.
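The pipeline idea described above can be sketched in a few lines of Python. This is purely illustrative, assuming a simple connector/filter/processor staging; all of the names and data formats below are hypothetical and do not reflect GeoEDF's actual API.

```python
# Hypothetical sketch of a retrieve -> subset -> standardize pipeline,
# in the spirit of GeoEDF's goal of fetching only the data a researcher
# needs and emitting it in a uniform format. Not GeoEDF's real interface.

from dataclasses import dataclass

@dataclass
class Record:
    station: str
    lat: float
    lon: float
    value: float

def fetch(raw_rows):
    """Connector stage: parse heterogeneous repository rows into Records."""
    return [Record(r["id"], r["lat"], r["lon"], r["val"]) for r in raw_rows]

def subset(records, bbox):
    """Filter stage: keep only records inside a (min_lat, min_lon,
    max_lat, max_lon) bounding box, so downstream steps see less data."""
    min_lat, min_lon, max_lat, max_lon = bbox
    return [r for r in records
            if min_lat <= r.lat <= max_lat and min_lon <= r.lon <= max_lon]

def standardize(records):
    """Processor stage: emit one uniform dict layout for analysis tools."""
    return [{"station": r.station, "location": (r.lat, r.lon), "value": r.value}
            for r in records]

# Example: two stations from a mock repository; only one falls in the box.
raw = [
    {"id": "A1", "lat": 40.4, "lon": -86.9, "val": 3.2},
    {"id": "B2", "lat": 35.0, "lon": -80.0, "val": 1.1},
]
pipeline = standardize(subset(fetch(raw), bbox=(39.0, -88.0, 41.0, -85.0)))
print(pipeline)
```

The point of the staged design is that each step (retrieval, spatial subsetting, format conversion) can be swapped independently, which is what lets a "plug and play" platform reuse the same pipeline across very different data repositories.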
Song's co-principal investigators are scientists whose research projects will serve as use cases for GeoEDF. One of them, Jian Jin, an assistant professor of agricultural and biological engineering, is developing a handheld crop scanner that will allow farmers to get information about the health of their plants just by scanning a leaf. GeoEDF will include a way to automatically upload and store the data these sensors generate, along with data analysis tools for studying plant health and growth.
Song’s other co-PIs are:
- Venkatesh Merwade, a professor of civil engineering, who will use GeoEDF for flood modeling with a state-of-the-art hydrologic model.
- Uris Baldos, a research assistant professor in agricultural economics, who will use GeoEDF to integrate socio-economic data with environmental data to study the consequences of changing land use.
- Jack Smith, a senior research staff member of the Center for Environmental, Geotechnical and Applied Sciences at Marshall University, who will use GeoEDF to process water quality data from field sensors in Appalachia and convert it into standard EPA format for processing.
In addition, the GeoEDF team will collaborate with a number of partners in industry and government on the development and use of GeoEDF. Research scientists at the EPA's National Exposure Research Laboratory who do large-scale hydrologic modeling are interested in working with the team to incorporate remotely sensed field and satellite data.
GeoEDF will interoperate with other national geospatial cyberinfrastructures, including HydroShare, an open source system for sharing hydrologic data and models, so that users can seamlessly draw on the capabilities of different infrastructures.
Like MyGeoHub, GeoEDF will be built on HUBzero, Purdue’s ITaP-developed cyberinfrastructure that now powers more than 60 interactive, web-based hubs driving research and education in fields such as nanotechnology, cancer treatment, pharmaceutical manufacturing, volcanology, environmental modeling, biofuels, and the bonds between humans and companion animals.
Writer: Adrienne Miller, science and technology writer, Information Technology at Purdue (ITaP), 765-496-8204, email@example.com
Last updated: September 18, 2018