Hacker News

> hallucination ... it’s just doing what LLMs do

So using that term points to the need to implement actual "processing of thought," the way decently developed human intellects do.


