Energy-Efficient Storage Systems for Cloud Computing
The growing interest in cloud computing and the demand for high-performance computing infrastructure have driven the construction of large-scale datacenters that consume enormous amounts of electrical power. According to a recent survey, cloud providers account for 75% of the energy growth in the Asia-Pacific region. This rising energy consumption has not only become a major operating cost for cloud providers, but has also raised environmental concerns among the public. Disk storage systems are currently estimated to consume about 35 percent of the total power used in datacenters, and this share will only continue to increase as data-intensive applications demand fast, reliable access to online data resources. In this talk, we present an idea of using disk scheduling algorithms to reduce the energy consumption of a storage system, and discuss other energy-efficient datacenter designs and approaches.
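The intuition behind energy-aware disk scheduling can be shown with a toy simulation (an illustrative sketch, not the algorithm presented in the talk): deferring requests into batching windows consolidates disk activity, so the idle gaps between batches become long enough for the disk to spin down and save energy. The window length, per-request service time, and spin-down break-even threshold below are assumed values chosen for illustration.

```python
# Illustrative sketch only: all parameters are assumptions, not figures from the talk.
SPINDOWN_BREAK_EVEN = 10.0   # disk must stay idle this long (s) before spin-down saves energy
SERVICE_TIME = 0.01          # assumed fixed per-request service time (s)

def batch_schedule(arrivals, window):
    """Defer each request to the end of its batching window, so requests
    arriving close together are served back-to-back and the gaps between
    batches become long, uninterrupted idle periods."""
    return sorted((a // window + 1) * window for a in arrivals)

def spin_down_opportunities(service_starts, break_even=SPINDOWN_BREAK_EVEN):
    """Count idle gaps long enough that spinning the disk down pays off."""
    count = 0
    prev_end = 0.0
    for start in service_starts:
        if start - prev_end >= break_even:
            count += 1
        prev_end = max(prev_end, start) + SERVICE_TIME
    return count

# Requests arriving every ~6 s: served immediately (FIFO), no idle gap
# ever exceeds the break-even threshold, so the disk can never spin down.
arrivals = [1.0, 7.0, 13.0, 19.0, 25.0, 31.0]
fifo = sorted(arrivals)
batched = batch_schedule(arrivals, window=15.0)

print(spin_down_opportunities(fifo))     # 0 spin-down chances under FIFO
print(spin_down_opportunities(batched))  # 3 spin-down chances after batching
```

The trade-off is response time: each request may wait up to one window length before being served, which is why such schedulers are typically applied to workloads that tolerate delay.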
Jerry Chou received a Ph.D. in Computer Science from the University of California, San Diego in 2009. He is currently an assistant professor at National Tsing Hua University. Prior to joining NTHU, he worked in the data management group at Lawrence Berkeley National Lab. His research interests span various topics in cloud and distributed systems, including high-performance computing, dynamic resource management, load balancing, and energy-efficient algorithms.