Advanced civilizations probably have extensive cooling needs. Computing and communication equipment both work better at lower temperatures: a cooler computer is faster and needs less energy, and a cooler transceiver has lower thermal noise. Since this equipment cannot operate with perfect efficiency, it will need to shed waste heat.

It is not too difficult to cool a system down to the temperature of the cosmic background radiation. All you need to do is build a radiator with a very large surface area in interstellar space, and connect it to the system you are trying to cool with some high-thermal-conductance material. However, even at the cosmic background temperature of T = 3 K, erasing a bit still costs a minimum of k*T*ln 2 = 2.87e-23 J. What is needed is a way to efficiently cool a system down to near absolute zero.

I think the only way to do that is with black holes. A black hole emits Hawking radiation at a temperature of T = hbar*c^3/(8*pi*G*M*k), so the more massive the hole, the colder it is. For a black hole with the mass of the sun, that temperature is about 6e-8 K, and at that temperature erasing a bit costs only about 6e-31 J, nearly eight orders of magnitude less than at the cosmic background temperature. If you build an insulating shell outside the event horizon of a black hole, everything inside the shell would eventually cool down to the temperature of the black hole. However, it would not be necessary to build a complete shell around a black hole in order to take advantage of its low temperature
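The two figures above can be checked directly. Here is a short sketch that computes the Hawking temperature T = hbar*c^3/(8*pi*G*M*k) for a solar-mass black hole and the Landauer bit-erasure cost k*T*ln 2 at both the cosmic background temperature and the black hole's temperature; the constant values are standard CODATA/astronomical figures, and the function names are my own:

```python
import math

# Physical constants (SI units)
k_B = 1.380649e-23      # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.98892e30      # solar mass, kg

def hawking_temperature(M):
    """Hawking temperature of a black hole of mass M:
    T = hbar * c^3 / (8 * pi * G * M * k_B)."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def landauer_cost(T):
    """Minimum energy to erase one bit at temperature T (Landauer limit)."""
    return k_B * T * math.log(2)

T_bh = hawking_temperature(M_sun)
print(f"Hawking temperature of a solar-mass black hole: {T_bh:.2e} K")
print(f"Bit erasure cost at T = 3 K:  {landauer_cost(3.0):.2e} J")
print(f"Bit erasure cost at T = T_bh: {landauer_cost(T_bh):.2e} J")
```

Running this gives a Hawking temperature of roughly 6.2e-8 K, a 3 K erasure cost of 2.87e-23 J, and a black-hole-temperature erasure cost of about 5.9e-31 J, consistent with the estimates in the text.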