Hacker News

What's confusing to me is the dual use of the word entropy in both the physical sciences and in communication theory. The local minima are somehow stable in a world of increasing entropy. How do these local minima ever form when there's such a large arrow of entropy?

Certainly intelligence is a reduction of entropy, but it's also certainly not stable. Just like in cellular automata (https://record.umich.edu/articles/simple-rules-can-produce-c...), loops that are stable can't evolve, but loops that are unstable have too much entropy.

So, we're likely searching for a system that's metastable within a small range of input entropy (physical) and output entropy (information).
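The stable/unstable/metastable spectrum is easy to see in elementary cellular automata. A rough sketch (the rule choices and the block-entropy measure here are just illustrative, not anything from the linked article): Rule 0 freezes into a dead, zero-entropy state, Rule 30 is chaotic with near-maximal entropy, and Rule 110 sits in between, where persistent structures can still interact.

```python
# Sketch: measure the Shannon block entropy of elementary CA rules after
# many steps. Rule numbers and parameters are illustrative assumptions.
from collections import Counter
import math

def step(row, rule):
    """Apply an elementary CA rule to one row (periodic boundary)."""
    n = len(row)
    return [
        (rule >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

def block_entropy(row, k=3):
    """Shannon entropy (bits) of the length-k blocks appearing in the row."""
    n = len(row)
    blocks = Counter(tuple(row[(i + j) % n] for j in range(k)) for i in range(n))
    return -sum(c / n * math.log2(c / n) for c in blocks.values())

def run(rule, steps=200, width=101):
    row = [0] * width
    row[width // 2] = 1  # single seed cell in the middle
    for _ in range(steps):
        row = step(row, rule)
    return block_entropy(row)

# Rule 0: frozen (entropy 0). Rule 30: chaotic (high entropy).
# Rule 110: metastable middle ground.
for rule in (0, 30, 110):
    print(rule, round(run(rule), 3))
```

Loosely, "intelligent" dynamics would have to live in the Rule-110-like band: enough order to hold structure, enough disorder to keep changing.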


