
This is true, but only if mimicking, and more broadly statistically imitating data, are understood in a more generalized way.

LLMs statistically imitate real-world text. To reach a certain threshold of accuracy, it turns out they need to imitate the underlying Turing machine/program/logic that runs in our brains when we understand and react to text ourselves. That is no longer in the realm of old-school data-as-data statistics, I would say.
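
A toy sketch of the distinction (my own illustration, not from the comment, using made-up names like statistical_model and program_model): a model that only memorizes string statistics fails on unseen prompts, while one that reproduces the underlying procedure generalizes.

    from collections import defaultdict

    # "Data-as-data" statistics: memorize completions of prompts seen in training.
    lookup = defaultdict(str)
    training = [f"{a}+{b}={a+b}" for a in range(10) for b in range(10)]
    for line in training:
        prompt, answer = line.split("=")
        lookup[prompt + "="] = answer

    def statistical_model(prompt: str) -> str:
        # Pure lookup: fails on any prompt outside the training set.
        return lookup.get(prompt, "?")

    def program_model(prompt: str) -> str:
        # Imitates the underlying procedure (addition), so it generalizes.
        a, b = prompt.rstrip("=").split("+")
        return str(int(a) + int(b))

    print(statistical_model("123+456="))  # "?"   -- never seen, lookup fails
    print(program_model("123+456="))      # "579" -- the procedure generalizes

The claim, as I read it, is that past a certain accuracy threshold the only way to keep imitating the text is to behave like the second model, i.e. to implement the program that generated the text.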


