
Search works well for this today, no need for continuous learning

Not even sure how you envision continuous learning, but if you mean model updates, I'm not sure the economics work out




Actually, Claude has memory files now, so it has some sort of learning. I think it will improve over time, and the files should survive a model update.

putting stuff in markdown files is not "learning", it's called taking notes, like we've done for 1000s of years

I guess when I was in class and took notes, then reviewed them later I wasn't "learning" anything.

That later "learning" part is updating weights in your brain.

What AIs get is a cheat sheet for the session.
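
To make the distinction concrete, here's a minimal Python sketch of the "cheat sheet" model, under the assumption that memory files work roughly like this (NOTES_PATH, build_prompt, and remember are hypothetical names, not any real API): the notes are re-read and pasted into the context every session, while the weights never change.

    from pathlib import Path

    NOTES_PATH = Path("MEMORY.md")  # hypothetical per-user notes file

    def build_prompt(user_message: str) -> str:
        # The "memory" is just text prepended to the question each session.
        notes = NOTES_PATH.read_text() if NOTES_PATH.exists() else ""
        return f"Notes from earlier sessions:\n{notes}\n\nUser: {user_message}"

    def remember(fact: str) -> None:
        # "Learning" here is appending to a file, not updating any weights.
        with NOTES_PATH.open("a") as f:
            f.write(f"- {fact}\n")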


That's what I mean by continual learning: skills and memory files are a crutch until real learning can happen, which could be weights changing in the local instance.

And my point is that weight changes are unlikely to have the economic ROI to justify them on a person-by-person basis.

What you are suggesting is a very expensive late-training-phase activity. It's also no longer clear when fine-tuning helps or hurts; progress is rapid.
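
For contrast with the notes sketch above, here's a toy sketch of what per-instance "weights changing" would mean: one gradient step per interaction, with the learning stored in the parameters rather than in a file. The model, data, and learning rate are all illustrative stand-ins; doing this to a frontier model, per user, is exactly the expensive part.

    import torch

    model = torch.nn.Linear(16, 16)  # stand-in for a much larger LLM
    opt = torch.optim.SGD(model.parameters(), lr=1e-3)

    def learn_from_interaction(x: torch.Tensor, target: torch.Tensor) -> float:
        # One gradient step per interaction: the "learning" lives in the
        # updated parameters, not in a notes file.
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), target)
        loss.backward()
        opt.step()
        return loss.item()

    # e.g. one update from a single (input, feedback) pair
    learn_from_interaction(torch.randn(1, 16), torch.randn(1, 16))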


I see, I misunderstood your original message. Given how much progress has been made without it, it's perhaps not necessary, especially if the economics make it prohibitive.

Reading notes is only necessary because of how lossy human memory is. Reading notes doesn't give you new information; it just reinforces memory paths, which will fade, and you'll have to read the notes again later unless you frequently apply the knowledge, which again reinforces those paths (but lossily, so the bits of information not repeatedly used will fade, and you will again have to read the notes if you need those bits, or just to remind yourself what they were).

Socrates made a similar complaint about the invention of writing itself, in Plato's Phaedrus.


