The UHPC project does not expect the proposed technologies to be available until 2018, and DARPA is not particularly concerned whether the underlying technologies deployed in a UHPC ExtremeScale system are ever widely commercialized. But the agency is adamant that the system, its networking, its storage, and its cooling all fit into a single cabinet 24 inches wide, 78 inches high, and 40 inches deep - a little wider and taller than a standard server rack.
70-page PDF on the design goals
DARPA also wants the system to deliver 50 gigaflops per watt on the Linpack benchmark, with a peak performance of one petaflops; the 57 kilowatt power budget is what is required to run the entire box. The system has to chew on data coming in from a massive streaming sensor array and do single- and double-precision floating point math compatible with the IEEE 754 standard, as well as 16-bit, 32-bit, and 64-bit integer math. The UHPC ExtremeScale system will need something on the order of 10 bytes of memory per flops, or 10 petabytes.
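The arithmetic behind those targets is straightforward; this sketch (variable names are my own, numbers are from the stated goals) works out the compute power implied by the efficiency target and the memory implied by the 10-bytes-per-flops ratio:

```python
# Back-of-the-envelope check of the UHPC ExtremeScale targets.
PEAK_FLOPS = 1e15        # one petaflops peak performance
FLOPS_PER_WATT = 50e9    # 50 gigaflops per watt on Linpack
BYTES_PER_FLOPS = 10     # roughly 10 bytes of memory per flops

compute_watts = PEAK_FLOPS / FLOPS_PER_WATT   # power for the compute alone
memory_bytes = BYTES_PER_FLOPS * PEAK_FLOPS   # total memory capacity

print(f"Compute power at target efficiency: {compute_watts / 1e3:.0f} kW")
print(f"Memory capacity: {memory_bytes / 1e15:.0f} PB")
```

Note that 50 gigaflops per watt at one petaflops accounts for 20 kW of compute; the 57 kW budget covers the whole cabinet, including the networking, storage, and cooling.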
For comparison, Nvidia's Fermi GPGPU delivers about 3 gigaflops per watt, and AMD's FireStream 9250 delivered 8 gigaflops per watt in single precision.
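To put the 50 gigaflops per watt goal in perspective, a quick sketch (figures from the article) of the improvement factor required over those two GPUs:

```python
TARGET = 50.0      # UHPC goal, gigaflops per watt
FERMI = 3.0        # Nvidia Fermi GPGPU, per the article
FIRESTREAM = 8.0   # AMD FireStream 9250, single precision

# How many times more energy-efficient the UHPC design must be
print(f"vs Fermi: {TARGET / FERMI:.1f}x improvement needed")
print(f"vs FireStream 9250: {TARGET / FIRESTREAM:.2f}x improvement needed")
```

Roughly a 17x jump over Fermi and a 6x jump over the FireStream's single-precision figure, to be delivered within the program's 2018 horizon.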
IEEE Computing: Energy-Efficient Computing for Extreme-Scale Science (10-page PDF)