A mildly interesting report has just come out looking at the power consumption of computer servers in the US. The report, by Lawrence Berkeley National Laboratory, finds that servers and all the paraphernalia required to keep them working (refrigeration and so on) use up 1.2% of the electricity consumed in the US; the figure outside the US was 0.8%. This is actually a lower limit, as it doesn't include custom-built servers like the ones Google has by the thousand.
I can well believe that kind of number, having visited the room where they store the supercomputers used by the Astronomy group; most of the power actually seems to be spent trying to keep the damn things cool. There must be some way to use this energy more usefully, perhaps by using the waste heat from the supercomputers to heat the buildings or something. Does anyone have any bright ideas for how we could profitably use kilowatts of waste heat? I promise not to steal and patent the idea. (Unless it's really good.)
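For a rough sense of scale, here is a quick back-of-envelope sketch in Python. The 50 kW machine-room draw and the typical home heating demand are illustrative guesses of mine, not figures from the report:

# Back-of-envelope: nearly all the electricity a server draws ends up
# as heat, so a machine room is effectively a big electric heater.
# The figures below are illustrative guesses, not measurements.

server_power_kw = 50.0  # hypothetical draw for a modest machine room

# Assume a typical home needs very roughly 12,000 kWh of heat per year,
# which averages out to about 1.4 kW of continuous heating.
home_heat_demand_kw = 12_000 / (365 * 24)

homes_heated = server_power_kw / home_heat_demand_kw
print(f"{server_power_kw:.0f} kW of waste heat covers the average "
      f"heating demand of about {homes_heated:.0f} homes")

So the raw numbers aren't silly; the hard part is presumably plumbing low-grade server heat into a building's actual heating system.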