Scientific studies on climate helped establish a consensus. (Photo credit: Wikipedia)
BY JOHN MARKOFF

Creating a supercomputer that can model the future of the planet is perhaps the most daunting challenge facing climate and computer science experts.

The task would require running an immense set of calculations for several weeks, then recalculating them hundreds of times with different variables. Such a machine would need to be 100 times faster than today’s supercomputers. Built with today’s technology, this so-called exascale computer would consume electricity equivalent to that of 200,000 homes and might cost $20 million annually to operate, itself contributing to global warming.

For that reason, scientists are waiting for low-power computing techniques capable of significantly reducing the power requirements of an exascale computer.

But Krishna Palem, a computer scientist at Rice University in Texas, believes he has found a shortcut. By stripping away the transistors that add accuracy, he claims, it will be possible to cut the energy demands of calculating while increasing speed.

“Scientific calculations like weather and climate modeling are generally, inherently inexact,” Dr. Palem said. “We’ve shown that using inexact computation techniques need not degrade the quality of the weather-climate simulation.”

To create models, scientists divide the world into a three-dimensional grid and compute equations describing the physics within each cell. Current climate models used with supercomputers have cell sizes of about 100 kilometers, with each cell representing the climate for that area of Earth’s surface. Accurately predicting the long-term impact of climate change will require shrinking the cell size to a kilometer. Such a model would require more than 200 million cells and roughly three weeks to compute a single simulation of climate change over a century.
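As a back-of-envelope illustration of why resolution is so costly, the Python sketch below counts surface cells at the two grid sizes. The surface-area figure and the single-vertical-layer simplification are assumptions made here for illustration, not figures from the article.

```python
# Rough illustration: how the cell count grows as the grid shrinks.
# Assumptions (illustrative, not from the article): Earth's surface
# area is ~510 million km^2, and only one vertical layer is counted.

EARTH_SURFACE_KM2 = 510e6

def horizontal_cells(cell_size_km: float) -> float:
    """Cells of the given edge length needed to tile Earth's surface."""
    return EARTH_SURFACE_KM2 / cell_size_km ** 2

coarse = horizontal_cells(100.0)  # ~51,000 cells at 100 km resolution
fine = horizontal_cells(1.0)      # ~510 million cells at 1 km resolution

print(f"100 km grid: {coarse:,.0f} cells")
print(f"  1 km grid: {fine:,.0f} cells ({fine / coarse:,.0f}x more)")
```

Shrinking the cell edge by a factor of 100 multiplies the horizontal cell count by 10,000, which is why the computing cost of a kilometer-scale model explodes.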

“We can’t do a lab experiment with the climate,” said Tim Palmer, a University of Oxford climate physicist. “We have to rely on these models which try to encode the complexity of the climate, and today we are constrained by the size of computers.”

Dr. Palem says computing the rate of global warming may be possible with a computer that would use specialized low-power chips to solve a portion of the problem. He describes his approach as “inexact” computing.
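A minimal sketch of the idea, assuming a software emulation of reduced precision: rounding values to fewer mantissa bits before arithmetic mimics the output of a pruned, lower-precision circuit. The function and the choice of 12 bits are illustrative; the article does not describe Dr. Palem’s actual hardware techniques.

```python
import struct

def truncate_mantissa(x: float, keep_bits: int) -> float:
    """Zero out low-order mantissa bits of a 64-bit float, emulating
    the reduced precision of a pruned arithmetic circuit (illustrative)."""
    assert 0 <= keep_bits <= 52
    bits = struct.unpack("<Q", struct.pack("<d", x))[0]
    mask = ~((1 << (52 - keep_bits)) - 1) & 0xFFFFFFFFFFFFFFFF
    return struct.unpack("<d", struct.pack("<Q", bits & mask))[0]

# Sum many small terms at full and at reduced precision.
exact = sum(1.0 / n for n in range(1, 100001))
inexact = sum(truncate_mantissa(1.0 / n, 12) for n in range(1, 100001))
print(f"exact   = {exact:.6f}")
print(f"inexact = {inexact:.6f}  relative error = {abs(exact - inexact) / exact:.2e}")
```

The tradeoff this is meant to show: with only 12 of 52 mantissa bits kept, the sum is still close to the exact answer, and in hardware the discarded bits would translate into fewer transistors and less energy per operation.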

Engineers trying to design an exascale computer have set a goal of staying within a power budget of 30 megawatts, experts say.

Dr. Palem has been imploring the computing world to back away from its romance with precision for more than a decade. He has recently developed allies among climatologists like Dr. Palmer, who in the journal Nature recently called on the climate community to form an international effort to build a machine fast enough to solve basic questions about the rate of global warming.

Not everyone is convinced Dr. Palem’s computer architecture ideas will be applicable. “For consequential problems, where inexact results could cause a bridge to be mis-designed, or erroneous conclusions about the mechanics of climate, the inexactness is problematic,” said John Shalf, department head for computer science at the Lawrence Berkeley National Laboratory.

Dr. Palem and Dr. Palmer are trying to overcome these objections.

“It’s a trivial amount of money when you think of climate impact being in the trillions of dollars,” Dr. Palmer said. “It’s actually an existential question. If it’s at one end of the spectrum, we can adjust, but if it’s at the other end of the spectrum, we’re not going to come out of it unless we cut emissions in the next decade.”


Taken from TODAY Saturday Edition, The New York Times International Weekly, May 23, 2015
