
Yes.

It’s trained on completing the text.

If an expert writes a long text and you add "in summary: " at the end, the model will complete it with something approximating the truth (depending on model size, training, etc.).
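The idea can be sketched with a toy bigram model: count which word follows which in a training corpus, then complete a prompt by repeatedly predicting the next word. This is a drastic simplification (GPT uses a transformer over subword tokens, not word bigrams), and the corpus here is made up purely for illustration:

```python
from collections import Counter, defaultdict

def train_bigram(corpus: str) -> dict:
    """Count, for each word, which words follow it in the corpus."""
    words = corpus.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def complete(model: dict, prompt: str, max_words: int = 10) -> str:
    """Greedily append the most frequent continuation, word by word."""
    words = prompt.split()
    for _ in range(max_words):
        followers = model.get(words[-1])
        if not followers:
            break
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

# Hypothetical "expert text" standing in for real training data.
corpus = ("the model is trained on text . "
          "in summary : the model predicts the next word . "
          "in summary : the model completes text .")
model = train_bigram(corpus)
print(complete(model, "in summary :", max_words=2))
```

The completion after "in summary :" just reflects what most often followed those words in training, which is the whole mechanism the comment describes, scaled down.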

Humans do something similar. We have a model of the subject in our head and we can summarize it, but we will forget some parts, make errors, etc. GPT is very similar.


