February 10, 2015
Yong Chen, an assistant professor of computer science and director of the Data-Intensive Scalable Computing Lab, will lead a team of researchers to develop a new concept called “compute on data path,” a step toward “data-centric” computing that assimilates and analyzes many different types of data used in scientific discovery, all at one time.
“This is a sizable grant awarded from a very competitive NSF core program, and we deeply appreciate the support of our work from the NSF and the recognition from our peers,” Chen said. “We are primarily doing research on trying to address data-intensive scientific computing needs to create 'data-centric computing' for better scientific discovery and innovation.”
Chen said he and other scientists will lay groundwork for a new data assimilation computing concept capable of combining data that may not be similar.
“At this stage, this is more about a methodology development than creating an actual supercomputer,” he said. “This is more of an investigation to see whether a new concept is feasible and whether a change can be made to the current software stack to make it work in a more data-centric way, with significantly better productivity in scientific discovery.”
Supercomputing has become a popular and useful tool for conducting computer simulations and data analysis for scientific discovery in many fields, including climate sciences, healthcare, biology, chemistry and astrophysics. The problem, Chen said, is that the methods used to conduct simulations and analyses are “computing-centric.”
As data volume grows over time in this “computing-centric” model, the data floods into the computer system, creating a bottleneck of information.
“The traditional computing-centric method is not really the best way for today's 'data-intensive' scientific discovery,” he said. “This three-year project will develop new concepts and methodologies of 'data-centric' solutions. We're going to model computations and data as objects and move the computation objects to the data objects instead of moving the data to the computations. We will try to make the computations happen right in place with the data for better performance, efficiency and productivity in scientific discovery via computer simulations and analyses.
“If everything goes well, this could make a significant impact on the supercomputing field.”
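The idea Chen describes can be sketched in a few lines of Python. This is a minimal illustration only, not the project's actual design: the names here (DataObject, ComputeObject, run_in_place) are hypothetical, and the "nodes" are simulated locally. The point is the inversion it demonstrates: a small computation object travels to each large data object, and only small results move back.

```python
# Hypothetical sketch of the "compute on data path" idea: model
# computations and data as objects, and ship the computation object to
# where the data object lives, returning only a small result per node.

class DataObject:
    """A large dataset that stays in place on its storage node."""
    def __init__(self, node, values):
        self.node = node      # where the data physically resides
        self.values = values  # the (potentially huge) payload


class ComputeObject:
    """A small, movable unit of computation."""
    def __init__(self, fn):
        self.fn = fn


def run_in_place(compute, data_objects):
    """Apply the computation at each data object's location and gather
    only the small per-node results, instead of moving the data."""
    results = {}
    for d in data_objects:
        # In a real system this dispatch would cross the network;
        # here we simply invoke the function on each node's local data.
        results[d.node] = compute.fn(d.values)
    return results


# Example: compute a per-node sum, then combine the small partial results.
datasets = [DataObject("node0", [1, 2, 3]), DataObject("node1", [4, 5, 6])]
partial_sums = run_in_place(ComputeObject(sum), datasets)
total = sum(partial_sums.values())
print(partial_sums, total)  # {'node0': 6, 'node1': 15} 21
```

In the traditional computing-centric model, both payloads would be copied to a central process before summing; here only two small integers cross the (simulated) network boundary.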
The Edward E. Whitacre Jr. College of Engineering has educated engineers to meet the technological needs of Texas, the nation and the world since 1925.
Approximately 4,300 undergraduate and 725 graduate students pursue bachelor's, master's and doctoral degrees offered through eight academic departments: civil and environmental, chemical, computer science, electrical and computer, engineering technology, industrial, mechanical and petroleum.
The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense."
With an annual budget of about $6.9 billion (FY 2010), NSF is the funding source for approximately 20 percent of all federally supported basic research conducted by America's colleges and universities. In many fields, such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.