Cloud computing research awards

Two cloud computing research awards worth $10 million have been announced by the National Science Foundation (NSF). The projects, called CloudLab and Chameleon, will help academic researchers develop and carry out experiments with new cloud architectures.

Cloud computing is the practice of using remote servers to store, manage and process data, rather than keeping the data on a personal computer or local server.

Over the last five years, cloud computing has become the dominant way of providing computing infrastructure for online services.

Started with academic researchers

Initially, the concepts for cloud computing came from academic researchers. As cloud computing became more popular, industry took over the design of its architecture.

The NSF says the $10 million in awards will complement industry’s efforts. The aim is to create architectures that can handle a new generation of pioneering applications, including safety-critical and real-time applications such as those used in transportation systems, power grids and medical devices.

The new projects form part of the NSF CISE Research Infrastructure: Mid-Scale Infrastructure – NSFCloud program.

Suzi Iacono, acting head of NSF’s Directorate for Computer and Information Science and Engineering (CISE), said:

“Just as NSFNet laid some of the foundations for the current Internet, we expect that the NSFCloud program will revolutionize the science and engineering for cloud computing.”

“We are proud to announce support for these two new projects, which build upon existing NSF investments in the Global Environment for Network Innovations (GENI) testbed and promise to provide unique and compelling research opportunities that would otherwise not be available to the academic community.”

Chameleon

This project, located at the University of Texas at Austin and the University of Chicago, is a large-scale, reconfigurable experimental environment for cloud research.

It will consist of 650 cloud nodes (connection points) with a storage capacity of 5 petabytes (1 petabyte = 10^15 bytes of digital information).
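As a rough sense of scale (the per-node figure below is our own back-of-the-envelope arithmetic, not part of the NSF announcement), spreading 5 petabytes evenly across 650 nodes works out to a little under 8 terabytes per node:

    # Back-of-the-envelope arithmetic for the Chameleon figures above.
    NODES = 650                  # cloud nodes
    TOTAL_PETABYTES = 5          # total storage capacity
    BYTES_PER_PETABYTE = 10**15  # 1 petabyte = 10^15 bytes

    total_bytes = TOTAL_PETABYTES * BYTES_PER_PETABYTE
    per_node_terabytes = total_bytes / NODES / 10**12
    print(f"~{per_node_terabytes:.1f} TB per node")  # ~7.7 TB per node

In practice the storage will not be divided evenly, so this is only an order-of-magnitude check.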

Researchers will configure slices of Chameleon as custom clouds using custom or pre-defined software to determine the usability and efficiency of several types of cloud architectures on a range of problems, from machine learning and adaptive operating systems to weather simulations and forecasting floods.
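The article does not describe Chameleon’s provisioning interface, but the workflow it implies (reserve a slice of the testbed, load custom or pre-defined software onto it, run the experiment) can be sketched in a few lines. Every name in the snippet below, from TestbedClient to reserve_slice, is a hypothetical illustration rather than a real Chameleon API:

    # Hypothetical sketch of the reserve-configure-run workflow described
    # above; every name here is illustrative, not Chameleon's actual API.

    class TestbedClient:
        """Toy stand-in for a cloud-testbed provisioning client."""

        def reserve_slice(self, nodes, hours):
            # Carve out a dedicated slice of the testbed's hardware.
            print(f"Reserved {nodes} nodes for {hours} hours")
            return {"nodes": nodes, "hours": hours}

        def deploy(self, lease, image):
            # Load a custom or pre-defined software stack onto the slice.
            print(f"Deploying '{image}' onto {lease['nodes']} nodes")

        def run(self, lease, command):
            # Launch the experiment on the configured slice.
            print(f"Running: {command}")

    client = TestbedClient()
    lease = client.reserve_slice(nodes=32, hours=24)
    client.deploy(lease, image="custom-virtualization-stack")
    client.run(lease, command="./flood-forecast --input basin-data")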

(The Chameleon team)

According to the NSF:

“The testbed will allow ‘bare-metal access’ – an alternative to the virtualization technologies currently used to share cloud hardware, allowing for experimentation with new virtualization technologies that could improve reliability, security and performance.”
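Bare-metal access matters because measurements taken inside a virtual machine include the hypervisor’s overhead, while measurements on bare metal do not. One simple way for an experiment to record which environment it ran in on a Linux node is the standard systemd-detect-virt tool; the small Python wrapper below is just one possible way to call it, not anything prescribed by the testbed:

    import subprocess

    def detect_virtualization():
        """Return the virtualization type, or 'none' on bare metal.

        Wraps systemd-detect-virt, which prints a technology name such
        as 'kvm' or 'xen', or 'none' when running on bare metal (in
        which case it also exits with a non-zero status).
        """
        result = subprocess.run(["systemd-detect-virt"],
                                capture_output=True, text=True)
        return result.stdout.strip() or "unknown"

    print(detect_virtualization())  # 'none' indicates bare metal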

As its name implies, the Chameleon testbed is designed to adapt itself to a broad range of experimental requirements.

Kate Keahey, principal investigator for Chameleon, said:

“Users will be able to run those experiments on a large scale, critical for big data and big compute research. But we also want to go beyond the facility and create a community where researchers will be able to discuss new ideas, share solutions that others can build on or contribute traces and workloads representative of real life cloud usage.”

Chameleon is unique in that it supports a wide range of computer architectures.

CloudLab

(The computer cluster that provides the main hardware resource for Apt, an NSF-funded precursor to CloudLab. Photo: NSF)

The second project, based at the University of Wisconsin, the University of Utah and Clemson University, supports the development of a large-scale distributed infrastructure on top of which engineers will be able to create several different types of clouds.

Each site will have its own unique hardware, architecture and storage features. The sites will link to each other via 100-gigabit-per-second connections on Internet2’s advanced platform, supporting OpenFlow and other software-defined networking technologies.
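OpenFlow is what makes those links programmable: a software controller can install forwarding rules directly into the network switches. As a minimal, hedged illustration (the article names no controller framework; the open-source Ryu framework is our choice here), the app below installs a single rule that floods every packet, about the simplest OpenFlow program there is:

    # Minimal OpenFlow 1.3 "hub" app for the Ryu controller framework.
    # Run with: ryu-manager hub.py
    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3

    class SimpleHub(app_manager.RyuApp):
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
        def switch_features_handler(self, ev):
            datapath = ev.msg.datapath
            ofproto = datapath.ofproto
            parser = datapath.ofproto_parser
            # One catch-all rule: flood every packet out of all ports.
            match = parser.OFPMatch()
            actions = [parser.OFPActionOutput(ofproto.OFPP_FLOOD)]
            inst = [parser.OFPInstructionActions(
                ofproto.OFPIT_APPLY_ACTIONS, actions)]
            datapath.send_msg(parser.OFPFlowMod(
                datapath=datapath, priority=0,
                match=match, instructions=inst))

A real testbed experiment would swap the flood rule for whatever forwarding behavior is under study.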

Robert Ricci, principal investigator for CloudLab, said:

“Today’s clouds are designed with a specific set of technologies ‘baked in’, meaning some kinds of applications work well in the cloud, and some don’t. CloudLab will be a facility where researchers can build their own clouds and experiment with new ideas with complete control, visibility and scientific fidelity.”

“CloudLab will help researchers develop clouds that enable new applications with direct benefit to the public in areas of national priority such as real-time disaster response or the security of private data like medical records.”

At its three data centers, CloudLab will provide about 15,000 processing cores and more than 1 petabyte of storage.

Each data center will have different hardware, allowing for further experimentation. The research team will be working alongside three vendors – Dell, Cisco and HP.