
You can radiate easily.

But not convect, which is why it's much, much harder than removing heat on Earth.



Of course you're not convecting, but if you're radiating from a hot body into an ambient two kelvin, you're going to lose heat really, really fast. IIRC heat loss by black-body radiation into the surroundings is proportional to the fourth power of the temperature difference between the body and its surroundings (from memory, and going back a very long way, so maybe incorrect).


> proportional to the fourth power of the temperature difference between the body and surroundings

Almost. It’s proportional [0] to T_hot^4 - T_cold^4. For a 100 °C surface with emissivity 1, that’s about 1 kW/m² if there is no radiation coming back, which really isn’t very high. You cannot cheat this with fancy folded-up radiating surfaces (it’s thermodynamically impossible, and the actual mechanism that kills it is one fin of the heatsink radiating right at the next one).

So cooling in space is hard. You’re not getting GPU-like power densities without a physically immense radiating surface extending way past those GPUs.

[0] Caveat: emissivity can depend on wavelength, and the law holds independently at each wavelength. So this can introduce interesting effects, which is how all the fancy prototype roof-cooling materials work, and it’s also related to how “spectrally selective” windows and window films work.
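The ~1 kW/m² figure above is easy to sanity-check with the Stefan-Boltzmann law. A minimal sketch (the function name and the emissivity-1, no-view-factor-loss assumptions are mine, purely illustrative):

```python
# Net radiated flux between a surface and its surroundings,
# per the Stefan-Boltzmann law: q = eps * sigma * (T_hot^4 - T_cold^4).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def net_radiated_flux(t_hot_k, t_cold_k, emissivity=1.0):
    """Net radiated power per unit area, in W/m^2 (temperatures in kelvin)."""
    return emissivity * SIGMA * (t_hot_k**4 - t_cold_k**4)

# A 100 C (373.15 K) surface radiating into ~2 K deep space:
print(net_radiated_flux(373.15, 2.0))  # roughly 1.1e3 W/m^2
```

Note that the fourth powers apply to the absolute temperatures separately, not to their difference, which is exactly the correction made in the comment above.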


Thanks, you clearly know your subject. I must say that, without doubting your figures, I am astonished at how little heat can be lost radiatively. Given the fourth powers, I assumed it must be vastly higher, but clearly not, much against my intuition. Thanks for a good answer!



