As I sit here typing away, Amazon.com’s Cloud Player serves up 320 tunes I’ve purchased over the past year and a half. I can play them anywhere, any time, on any computer with Internet access. I don’t have to lug around my laptop or even a flash drive. What’s not to like?
Our greener friends worry about all the power consumed by the data centers that deliver computer services over the Internet. Think of all the emissions!
A year-long New York Times investigation summarized in Saturday’s (Sep. 22) edition (“Power, Pollution and the Internet”) spotlights the explosive growth of the data storage facilities supporting our PCs, cell phones, and iPods, and the associated surge in energy demand. According to The Times:
The impact of the Internet — or, more broadly, the proliferation of digital technology and networks — on energy consumption and greenhouse gas emissions has been a contentious topic since 1999, when technology analyst Mark P. Mills published a study provocatively titled “The Internet Begins with Coal” and co-authored with Peter Huber a Forbes column titled “Dig more coal: The PCs are coming.”
Mills and Huber argued that digital networks, server farms, chip manufacture, and information technology had become a new key driver of electricity demand. And, they said, as the digital economy grows, so does demand for super-reliable power — the kind you can’t get from intermittent sources like wind turbines and solar panels.
Huber and Mills touted the policy implications of their analysis. To wire the world, we must electrify the world. For most nations, that means burning more coal. The Kyoto agenda imperils the digital economy, and vice versa.
Others — notably Joe Romm and researchers at the Lawrence Berkeley National Laboratory (LBNL) — argued that the Internet was a minor contributor to electricity demand and potentially a major contributor to energy savings in such areas as supply-chain management, telecommuting, and online purchasing.
Although Mills’s “ballpark” estimates — 8% of the nation’s electric supply absorbed by Internet-related hardware and 13% of U.S. power consumed by all information technology — were likely much too high in 1999, they may now be close to the mark. On the question of basic trend and direction, Mills was spot on.
Critics scoffed at Mills’s contention that, in 1999, computers and other consumer electronics accounted for a significant share of household electricity consumption. Ten years later, in Gadgets and Gigawatts, the International Energy Agency (IEA) reported that in many OECD households, electronic devices — a category that includes televisions, desktop computers, laptops, DVD players and recorders, modems, printers, set-top boxes, cordless telephones, answering machines, game consoles, audio equipment, clocks, battery chargers, mobile phones, and children’s games — consumed more electricity than traditional large appliances did. The IEA projected that, to operate those devices, households around the world would spend around $200 billion on electricity and would require the addition of approximately 280 gigawatts (GW) of new generating capacity by 2030. The agency also projected that even with the improvements foreseen in energy efficiency, electricity consumption by electronics in the residential sector would increase 250% by 2030. Saturday’s New York Times article further vindicates Mills’s central insight (even if not his specific estimates).
Jonathan Koomey, one of the authors of the LBNL critique of Mills’s 1999 study, estimates that, nationwide, data centers consumed about 76 billion kilowatt-hours in 2010, or 2% of U.S. electricity use in that year. In a Forbes column published last year, Mills opined that if we factor in three other components of the “digital energy ecosystem” — (1) the energy required to transport the data from storage centers to end users, (2) the “electricity used by all the digital stuff on desks and in closets in millions of homes and businesses,” and (3) the energy required to “manufacture all the hardware for the data centers, networks, and pockets, purses and desktops” — then the digital economy’s total appetite “is north of 10% of national electricity use.”
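Those percentages are easy to check with back-of-the-envelope arithmetic. The sketch below is my own, and the figure of roughly 3,900 billion kWh for total 2010 U.S. electricity consumption is an assumption, not a number from the article:

```python
# Back-of-the-envelope check of the data-center and "digital ecosystem" shares.
# ASSUMPTION: total U.S. electricity consumption in 2010 was roughly 3,900 billion kWh.
US_TOTAL_KWH_2010 = 3_900e9      # assumed U.S. total (kWh)
DATA_CENTER_KWH_2010 = 76e9      # Koomey's estimate for data centers (kWh)

data_center_share = DATA_CENTER_KWH_2010 / US_TOTAL_KWH_2010
print(f"Data centers alone: {data_center_share:.1%} of U.S. electricity")  # ~1.9%

# Mills's "north of 10%" figure for the whole digital ecosystem implies roughly
# five times the data-center number once networks, end-user devices, and
# hardware manufacturing are counted.
implied_total_kwh = 0.10 * US_TOTAL_KWH_2010
print(f"A 10% share would be about {implied_total_kwh / 1e9:.0f} billion kWh")
```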
The Times laments that data centers “waste” vast amounts of power. On a typical day, only about 6% to 12% of a center’s computing power is actually utilized, yet most of the facility’s servers will be kept running around the clock. To call that wasteful, however, is to confuse the engineering concept of efficiency with the economic concept. In economics, what matters is value to the consumer. Consumers demand reliable, uninterrupted access to data. Keeping all the servers humming ensures the center can handle unexpected peaks in demand without crashing. A center that saves energy but bogs down or crashes will lose customers or go out of business. As one industry analyst told The Times, “They [data center managers] don’t get a bonus for saving on the electric bill. They get a bonus for having the data center available 99.999 percent of the time.”
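To put that “five nines” target in perspective, here is a quick arithmetic sketch (mine, not the article’s) of how little downtime 99.999% availability actually permits:

```python
# Downtime allowed per year at various availability targets.
MINUTES_PER_YEAR = 365.25 * 24 * 60

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime_min = (1 - availability) * MINUTES_PER_YEAR
    print(f"{availability:.3%} uptime allows about {downtime_min:.1f} minutes of downtime per year")
```

Five nines works out to roughly five minutes of downtime a year; keeping lightly loaded servers running is, in effect, the insurance premium paid for that headroom.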
Obviously, it is in a center’s interest to find ways to provide the same (or greater) value to consumers at lower cost, including lower energy cost. But, notes Mills, efficiency tends to increase consumption, not reduce it:
Car engine energy efficiency improved 500 percent pound-for-pound from early years to the late 20th century. Greater efficiency made it possible to make better, more featured, safer, usually heavier and more affordable cars. So rising ownership and utilization lead to 400 percent growth in transportation fuel use since WWII. The flattening of automotive energy growth in the West is a recent phenomenon as we finally see near saturation levels in road-trips per year and cars-per-household. We are a long way from saturation on video ‘trips’ on the information highways.
Efficiency gains are precisely what creates and increases overall traffic and energy demand; more so for data than other service or products. From 1950 to 2010, the energy efficiency of information processing improved ten trillion-fold in terms of computations per kWh. So a whole lot more data-like machines got built and used — consequently the total amount of electricity consumed to perform computations increased over 100-fold since the 1950s – if you count just data centers. Count everything we’re talking about here and the energy growth is beyond 300-fold.
Fundamentally, if it were not for more energy-efficient logic processing, storage and transport, there would be no Google or iPhone. At the efficiency of early computing, just one Google data center would consume more electricity than Manhattan. Efficiency was the driving force behind the growth of Internet 1.0 as it will be for the wireless video-centric Internet 2.0.
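Taking the figures Mills quotes at face value, the implied growth in the sheer volume of computation is easy to work out. This is a rough sketch of the arithmetic, not Mills’s own calculation:

```python
# Implied growth in total computation, taking the quoted factors at face value.
efficiency_gain = 10e12   # computations per kWh: ~ten-trillion-fold improvement, 1950-2010
energy_growth = 100       # data-center electricity use: ~100-fold growth over the same span

computation_growth = efficiency_gain * energy_growth
print(f"Implied growth in computations performed: roughly {computation_growth:.0e}-fold")
```

That is the rebound effect Mills is describing: the energy cost per computation fell by a factor of about ten trillion, demand for computation grew even faster, and total electricity use still rose.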
So what’s the solution? Where Mills once argued that the “Internet Begins with Coal,” he now argues that “The Cloud Begins with Coal (and Fracking)”:
Some see the energy appetite of the Cloud as a problem. Others amongst us see it as evidence of a new global tech boom that echoes the arrival of the automotive age. We’re back to the future, where the microprocessor today as an engine of growth may not be new, anymore than the internal combustion engine was new in 1958. It’s just that, once more, all the components, features and forces are aligned for enormous growth. With that growth we will find at the bottom of this particular digital well, the need to dig more coal, frack more shale….
Source: http://www.globalwarming.org/2012/09/25/cloud-computing-friend-or-foe-of-kyotoism/