Yes, this is not local-first; the name is bad.

Horrible. Just because you have code that runs outside a browser doesn't mean you have something that's local. This goes double when the code requires API calls: your net goes down and this stuff does nothing.

For a web developer, local-first only describes where the program's state lives. In the case of this app, that's in local files. If Anthropic's API were down, you would just use something else; something like OpenRouter supports model fallbacks out of the box.
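For example (a rough sketch, assuming OpenRouter's OpenAI-compatible endpoint and its "models" fallback list; the parameter name and model IDs here are from memory, so verify them against the current docs):

    # Sketch: an OpenAI-compatible client pointed at OpenRouter, with a
    # fallback list so an Anthropic outage gets routed to another provider.
    # The "models" field and the model IDs are assumptions; check the docs.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key="sk-or-...",  # placeholder for your OpenRouter key
    )

    resp = client.chat.completions.create(
        model="anthropic/claude-3.5-sonnet",  # preferred model
        extra_body={"models": [               # tried in order if the first fails
            "openai/gpt-4o",
            "meta-llama/llama-3.1-70b-instruct",
        ]},
        messages=[{"role": "user", "content": "Summarize my notes."}],
    )
    print(resp.choices[0].message.content)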

Not to mention that you can actually have something that IS local AND runs in a browser :D

In a world where IT doesn't mean anything, crypto doesn't mean anything, AI doesn't mean anything, AGI doesn't mean anything, End-to-end encryption doesn't mean anything, why should local-first mean anything? We must unite against the tyranny of distinction.

It absolutely can be pointed to any standard endpoint, either cloud or local.

It’s far better for most users to be able to specify an inference server (even on localhost in some cases) because the ecosystem of specialized inference servers and models is a constantly evolving target.

If you write this kind of software, you will not only be reinventing the wheel but also probably disadvantaging your users if you try to integrate your own inference engine instead of focusing on your agentic tooling. Ollama, vLLM, Hugging Face, and others are devoting their focus to the servers; there is no reason to sacrifice the front-end tooling effort to duplicate their work.
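That's the whole point of standardizing on the OpenAI-style API: the same client code works against Ollama, vLLM, or a hosted provider just by swapping the base URL. A sketch (the ports are Ollama's and vLLM's usual defaults, but treat them and the model name as assumptions about your setup):

    # Sketch: the same OpenAI-compatible client, pointed at a local server.
    # Ollama usually exposes /v1 on port 11434; vLLM defaults to port 8000.
    # Adjust base_url and model for whatever server/model you actually run.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # e.g. Ollama; vLLM: http://localhost:8000/v1
        api_key="not-needed-locally",          # local servers typically ignore the key
    )

    resp = client.chat.completions.create(
        model="llama3.1",  # whatever model the local server has loaded
        messages=[{"role": "user", "content": "Hello from the local box."}],
    )
    print(resp.choices[0].message.content)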

Besides that, most users will not be able to run the better models on their daily driver, and will have a separate machine for inference, run inference in a private or rented cloud, or even go over a public API.


It is not local first. Local is not the primary use case. The name is misleading to the point I almost didn't click because I do not run local models.

I think the author is using local-first as in “your files stay local, and the framework is compatible with on-prem infra”. Aside from not storing your docs and data with a cloud service though, it’s very usable with cloud inference providers, so I can see your point.

Maybe the author should have spelled that capability out, even though it seems redundant: local-first implies local capability but also cloud compatibility; otherwise it would just be local, or local-only.


It's called "LocalGPT". It's a bad name.

Yeah, it's not exactly great lol. Could be the vision behind the project though, from an aspirational standpoint. But yeah, it kinda implies it will be more like Ollama or vLLM.

To be precise, it's exactly as local-first as OpenClaw (i.e., probably not, unless you have an unusually powerful GPU).

Yes, but OpenClaw (which is a terrible name for other reasons) doesn't have "local" in the name and so is not misleading.

I mean, at least OpenClaw is funny in the sense that a D port could finish the roundabout by calling itself "OpenClawD"...

It's just as misleading. A lot of their marketing, or at least the ClawBros, pitches it as running locally on your Mac Mini.

To be fair, you do keep significantly more control of your own data from a data portability perspective! A MEMORY.md file presents almost zero lock-in compared to some SaaS offering.

Privacy-wise, of course, the inference provider sees everything.


To be clear: keeping a local copy of some data provides no control over how the remote system treats that data once it's sent.

Which is what I said in my second sentence.

It’s worse than “[they] can see everything.” They can share it.

Is it not a given that anyone that gets access to a piece of information is also capable of sharing it?

Confused me at first: when I saw the mention of local plus the single-file thing on GitHub, I assumed they were going to have llamafile bundled, and I went looking through to see what model they were using by default.


