An LLM doesn't really know why it got something wrong, so unless it has access to the original chain of thought, it's just guessing.

They don't have access to their own network-level internals, but I assume they actually do have access to their chain of thought.
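Worth spelling out: on a typical chat API, the history that gets re-sent on a follow-up turn contains only the visible messages, so a "why did you get that wrong?" question arrives without the original reasoning. A minimal sketch of that message flow, assuming a generic role/content format (the names and strings here are illustrative, and providers differ on whether reasoning tokens are retained between turns):

    # Sketch of what a follow-up turn actually contains. The message
    # format mirrors common chat APIs; names here are illustrative.

    # Turn 1: the model produces hidden reasoning plus a visible answer.
    # Many serving stacks discard the reasoning once the turn ends.
    turn_1_reasoning = "...hidden chain of thought..."  # not stored
    turn_1_answer = "The capital of Australia is Sydney."  # wrong

    # The history re-sent on turn 2 keeps only the visible messages:
    history = [
        {"role": "user", "content": "What is the capital of Australia?"},
        {"role": "assistant", "content": turn_1_answer},  # reasoning gone
        {"role": "user", "content": "That's wrong. Why did you say that?"},
    ]

    # Turn 2 is a fresh forward pass over `history`, so any explanation
    # the model gives is a plausible reconstruction, not a readout of
    # the computation that produced the original mistake.
    for message in history:
        print(message["role"], ":", message["content"])

(Some serving stacks can pass reasoning items back between turns; whether that happens is a property of the API, not of the model itself.)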


