Hacker News
Marazan | 8 months ago | on: A non-anthropomorphized view of LLMs
Anthropomorphisation happens because humans are absolutely terrible at evaluating systems that produce conversational text output. ELIZA fooled many people into thinking it was conscious, and it wasn't even trying to do that.
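To see how little machinery that takes, here is a minimal ELIZA-style responder, a hypothetical sketch in the spirit of Weizenbaum's program, not his original DOCTOR script. A handful of regex rules that reflect the user's own words back is enough to feel "conversational" despite zero understanding:

```python
import re

# Illustrative rules (invented for this sketch): match a phrase pattern
# and echo the captured fragment back inside a canned question.
RULES = [
    (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.+)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"\bmy (.+)", re.I), "Why does your {0} concern you?"),
]

def respond(text):
    """Return the first matching rule's reflection, else a stock prompt."""
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            # Strip trailing punctuation so the echo reads naturally.
            return template.format(m.group(1).rstrip(".!?"))
    return "Please go on."

print(respond("I am worried about my job."))
# -> Why do you say you are worried about my job?
```

The trick is entirely on the human side: the reflected fragment makes the reply feel attentive, and the reader's mind supplies the rest.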