Let's Accelerate Scientific Progress by Bringing Supercomputing to Every Lab
Earlier this month, Ian Foster, widely recognized as one of the founders of grid computing, spoke at TEDxCERN about science being a voyage of discovery. However, he is frustrated by what he perceives to be an unnecessarily slow pace of discovery.
Speaking to an audience of CERN IT members the day before his TEDx talk, Foster explained that a lack of access to IT infrastructure is hampering progress at many smaller laboratories. “I believe that there’s a tremendous amount of science that isn’t getting done today because labs lack the central cyber infrastructure,” he says. “Most labs can’t afford to build out the sophisticated and vertically integrated machinery that something like high-energy physics is able to use, so we’ve got to find new ways of delivering essential services to these people.”
“We have managed to create exceptional infrastructure for the 1%, but what about the rest?” asks Foster. “We have big science, but small labs. How do we deliver cyber infrastructure to small groups? They need something that is frictionless, affordable and sustainable.”
Foster, who is director of the Computation Institute, says that cloud computing offers an excellent way to deliver such services for science. Providing IT services in a simple, easy-to-use manner is important, he says, because it frees scientists to concentrate on doing actual research. “Cloud may be the latest buzzword, but it is also a very exciting realisation of the sort of things we’ve been working on for many years.”
Big data, another buzz phrase, lies at the heart of the problem, but also provides many exciting opportunities. As scientific research across all fields generates ever-increasing volumes of data, researchers have to be able to securely store and efficiently analyse this data. This is where tools such as Globus Online, a data management tool developed by a team including Foster, come into play. This free, cloud-hosted service automates the management of file transfers between supercomputing facilities, campus clusters, lab servers, and personal computers. It is now used by the recently launched Blue Waters supercomputer to bring in data, as well as by the Earth System Grid and KBase projects in the US.
However, Foster argues that providing researchers with access to such data management tools is just the start. We need to create a “discovery cloud”, he claims. This, he says, would facilitate scientific research by incorporating data management and analysis at all levels, as well as by interacting with publication services and other vital scientific infrastructure. “We aim to provide more capability for more people at a sustainably lower cost by creatively aggregating and federating resources,” he explains.
Foster seeks to draw inspiration from the Web 2.0 user-interface and design principles of services such as Netflix, Flickr, Dropbox and Gmail, so as to make life easier for scientists at research institutes of all sizes and thus speed up the pace of scientific discovery. “We want to find a way that will allow us to deliver powerful resources and methods to everyone,” says Foster. “There’s even a lot more that citizen scientists could be doing if these services were made available to them, too,” he adds. “We basically need to provide what you might call science-as-a-service to all.” -- Andrew Purcell for iSGTW ©2013