Nuclear power plant not included

February 6, 2011

China's Tianhe-1A system, at the National Supercomputer Center in Tianjin

Now that the space race has been called off for lack of interest, great-power competition of the peaceful variety has shifted to another kind of big iron: the development of the world’s fastest supercomputer. Although the field was initially dominated by the United States, competition has heated up in recent years. According to the TOP500 project, the ten fastest computers in the world are operated by Germany, Japan, France, the United States, and, in spots #1 and #3, China.

Ever-increasing supercomputer speed is an easy phenomenon to comprehend (and to root for, if one is in a nationalistic mood), because it strikes us as being analogous to the increasing speed of personal computers, which have long benefited from the microprocessor industry’s success (so far) at adhering to Moore’s law. But the supercomputer race might be about to get boring. According to an essay by Peter Kogge in the February issue of IEEE Spectrum, a variety of physical constraints are about to push designers of these massive machines off the “scale it up again” improvement path and into uncharted territory:

How far away is an exaflops machine? A decent supercomputer of the 1980s could carry out about a billion floating-point operations per second. Today’s supercomputers exceed that by a factor of a million. The reigning champion today is China’s Tianhe-1A supercomputer, which late last year achieved a world-record 2.57 petaflops—that’s 2.57 quadrillion (2.57 x 10^15) flops—in benchmark testing. Still, to get to exaflops, we have a factor of almost 400 to go.
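The "factor of almost 400" above is just the ratio of the exascale target to Tianhe-1A's benchmark result; a quick sketch to check the arithmetic (the two flops figures are from the excerpt, everything else is plain division):

```python
# Back-of-the-envelope check of the flops gap quoted above.
tianhe_1a_flops = 2.57e15   # Tianhe-1A's world-record 2.57 petaflops
exaflops = 1e18             # one exaflops, DARPA's target

factor = exaflops / tianhe_1a_flops
print(f"Remaining speedup needed: {factor:.0f}x")  # ~389, i.e. "almost 400"
```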

The biggest obstacle to that by far is power. A modern supercomputer usually consumes between 4 and 6 megawatts—enough electricity to supply something like 5000 homes. Researchers at the University of Illinois at Urbana-Champaign’s National Center for Supercomputing Applications, IBM, and the Great Lakes Consortium for Petascale Computation are now constructing a supercomputer called Blue Waters. In operation, this machine is going to consume 15 MW—more actually, if you figure in what’s needed for the cooling system. And all that’s for 10 petaflops—two orders of magnitude less than DARPA’s exaflops goal.

If you tried to achieve an exaflops-class supercomputer by simply scaling Blue Waters up 100 times, it would take 1.5 gigawatts of power to run it, more than 0.1 percent of the total U.S. power grid. You’d need a good-size nuclear power plant next door.
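The gigawatt claim follows from straight multiplication; a minimal sketch, where the 15 MW figure and 100x scale-up are from the excerpt and the ~1.1 TW figure for total U.S. generating capacity is an assumed round number for illustration:

```python
# Rough power-scaling arithmetic behind the "nuclear power plant" remark.
blue_waters_mw = 15     # Blue Waters' projected draw at 10 petaflops
scale_factor = 100      # naive 100x scale-up to reach 1 exaflops

exascale_gw = blue_waters_mw * scale_factor / 1000
print(f"Naively scaled machine: {exascale_gw:.1f} GW")   # 1.5 GW

us_capacity_gw = 1100   # assumed total U.S. generating capacity (~1.1 TW)
share_pct = exascale_gw / us_capacity_gw * 100
print(f"Share of U.S. capacity: {share_pct:.2f}%")       # ~0.14 percent
```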

Read the rest here.