You don't need to understand how neurons work in detail to be able to use them to do something. In the past, we were able to use electricity for various purposes without knowing about electrons.
But my point is: have we really reached a technological level where we can use neurons like replaceable car parts? That video seems to suggest yes, but I’m still skeptical.
My impression is that this company is offering a product that’s still beyond our technological capabilities, much like the cold‑fusion startups that pop up from time to time.
I vibecoded an HN clone for philosophical topics, which I'm very fond of. However, I've done a terrible job of marketing it, so that's still on my list: http://forum.philosofriends.com
Right now it's basically a repository of philosophy-related links I find interesting, but it would be awesome to find a way to start generating philosophical discussions of the quality I find on HN for tech/AI.
I'd assume that if he got a call from Patrick himself and a second opportunity to interview, that's already a cue for the interviewers to pass him regardless of what he says?
> They could have done better. They chose the path of least resistance, putting in the least amount of effort, spending the least amount of resources into accomplishing a task
You might as well tell reality to do better: The reality of physics (water flows downhill, electricity moves through the best conductor, systems settle where the least energy is required) and the reality of business (companies naturally move toward solutions that cost less time, less money, and less effort)
I personally think that some battles require playing within the rules of the game. Not to wish for new rules. Make something that requires less effort and resources than Electron but is good enough, and people will be more likely to use it.
Shaming the use of Electron? I'll do that every day and twice on Sunday. Same with nonsense websites that waste gigabytes on bloat, spam users with ads, and feed the adtech beast. And I'll lay the credit for this monument to enshittification we call the internet at the feet of Google, Facebook, and Microsoft.
Using Electron and doing things shittily is a choice. If you're ever presented with a choice between doing something well and not, do the best you can. Electron is never the best choice. It's not even the easiest, most efficient choice. It's the lazy, zero-effort, default, ad-hoc choice picked by someone who really should know better but couldn't be bothered to even try.
Cosmopolitan can be used to bundle up any GUI package, plus your code, and a team of professional software devs should be able to cope with it just fine. You end up with a native executable in a slightly bigger package to ship, since it's carrying executables for various platforms, but you'd have effectively the same code and behavior everywhere. A few extra megabytes, instead of whatever the hell Electron is doing.
They could also use Java, or even one of the Electron-style clones that attempt to do better, like Tauri.
The point isn't that Electron is so awful. It's that the company with the purportedly best coding AI, and one of the best overall AI models in the world, chose to do the absolute tawdriest, cheapest, even laziest thing, without any consideration of what the right thing to do might be, what they could do to demonstrate their excellence and mastery of craft, or, at a bare minimum, the advanced capabilities of the AI.
Cursor used Claude to build a browser from scratch; it's not like their AI couldn't do it.
I don’t code, so I’m well out of my league, but this point that a “premier coding AI company should showcase its capabilities by using its own model to build superior software” rings true to me, right? Especially as we start to discuss AI as more dangerous than nuclear weapons,
yet it can’t even do that?
It might be a strange thing to say, but Java is still a viable alternative route. You can build a nice, fast cross-platform desktop application on it today. The language was designed for this kind of thing. The entry barrier is quite high, though.
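To ground that claim a bit, here's a minimal Swing sketch of a cross-platform desktop window; all names here are illustrative, not from the thread, and modern alternatives (JavaFX, Compose Multiplatform) exist too:

```java
import javax.swing.*;
import java.awt.*;

public class Main {
    // Build the UI from lightweight Swing components; the same code
    // renders on Windows, macOS, and Linux via the JVM.
    static JPanel buildPanel() {
        JPanel panel = new JPanel(new BorderLayout(8, 8));
        panel.add(new JLabel("Hello from Swing", SwingConstants.CENTER), BorderLayout.CENTER);
        panel.add(new JButton("Quit"), BorderLayout.SOUTH);
        return panel;
    }

    public static void main(String[] args) {
        if (GraphicsEnvironment.isHeadless()) {
            // No display available (e.g. CI): just verify the component tree builds.
            System.out.println("components: " + buildPanel().getComponentCount());
            return;
        }
        // All Swing UI work belongs on the event dispatch thread.
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Demo");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setContentPane(buildPanel());
            frame.pack();
            frame.setVisible(true);
        });
    }
}
```

Packaged with `jlink`/`jpackage`, this ships as a single installer per platform with no separate runtime download, which is where the "entry barrier" mostly lives.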
As far as I can tell after a quick Google, you can't share your Qt UI with the browser version of your app. Considering that "lite" browser-based versions of apps are a very common funnel to a more featureful desktop version, it makes sense to just use the UI tools that already work and provide a common experience everywhere.
The same search incidentally turned up that Qt requires a paid license for commercial projects, which is surprising to me and obviously makes it an even less attractive choice than Electron. Being less useful and costing more isn't a great combo.
> you can't share your Qt UI with the browser version of your app
You can with WASM (but you shouldn't).
> Qt requires a paid license for commercial projects
It doesn't; it requires a paid license if you don't want to abide by the (L)GPL license, which should be a fair deal, right? You want to get paid for your closed-source product, so you should not have any reservations about paying for their product that enables you to create yours, right? Or is it "money for me, but not for thee"?
> Being less useful and costing more isn't a great combo.
Very nice, but now explain why you are talking about using Qt to create apps, whereas the grandparent talks about the experience of using apps created with Qt.
I looked up the WASM Qt target and it renders to a canvas, which hampers accessibility. The docs even call out that this approach barely works for screen readers [0], and that it provides partial support by creating hidden DOM elements. This creates a branch of differing behavior between your desktop and browser app that doesn't have to exist at all with Electron.
It should go without saying that the requirements of the LGPL license are less attractive than those of the MIT one Electron has; fairness doesn't really come into it. Beyond the licensing hurdles that Qt devotes multiple pages of its website to explaining, it also gates commercial features such as "3D and graphs capabilities" [1] behind the paid license, which are more use cases already thoroughly covered by more permissively licensed web projects that work everywhere.
On your last point I'm completely lost; it's late here, so it might be me, but I'm not sure what distinction you're making. I interpreted dmix's comment generally to be about the process of producing software with either approach, given that my comment above was asking for details on alternatives from the perspective of a developer, not a user. I don't have any personal beef with using apps written with Qt.
I do frontend work, so I struggle to get over how bad most Qt GUIs are. They are far out of date compared to GNOME or macOS in a lot of the small widget details and menus.
Plus I use Mac these days and Qt apps just never looked right on that platform.
As a recent toe-dipper into Linux (now running Arch on a powerful mini PC with KDE Plasma), I'm shocked at how little progress has been made on the native UI side.
Well, it's not that surprising, considering that as soon as something radically new appears, it gets mistreated from all sides: platform owners, app developers, and users.
Windows' Metro/Modern UI was pretty good from several perspectives, but not enough effort was put into making it a universal thing fit for multiple purposes (half of the Windows settings were still in Control Panel for quite some time), it wasn't familiar to users (so they hated it), and it wasn't familiar to developers (so they created hideous apps).
In the opposing Linux camp, GNOME built GTK 4 with the Libadwaita UI library with an "everything is a phone app" mindset that not every app can adopt. For example, there's no application menu component (the line with File, View, Edit, etc.) shipped by default; you have to make it yourself or get it from somewhere. So GIMP is still developed on GTK 3 (not the modern 4) because it has all the components GIMP needs. Trying to get GNOME developers to implement anything outside their vision is a futile effort.
Please do continue to waste energy on doing something that will do nothing but allow you to feel superior about yourself. In fact, you will probably waste more energy than Electron ever has.
People can change but based on Facebook's actions vis-a-vis privacy, mental health, etc. there's little evidence that Zuckerberg has gone from treating his users like "dumb f...." to treating them like human beings.
If we're going to talk about quotes, here's one: "money amplifies who you are".
WhatsApp is one of the only corporate acquisitions I can think of where the acquired side lashes out at the acquirer as much as this ("It's time. Delete Facebook").
You're talking about someone who changed privacy settings out from under users; who, when told that gay people were being automatically added to groups that posted on their walls and outed them, dismissed it. Or "graph search". He doesn't think people deserve any respect when it's not him?
When a man changes, it is on him to prove that he has changed. Has Zuck atoned in any way? Has Meta?
I'm a big believer in second chances and letting people rehabilitate, but there's no evidence that Meta or Zuck have changed for the better. Meanwhile, *there is plenty of evidence suggesting he has only become more uncaring and deceptive, as Meta has only become more invasive over time*, the article itself being one such example.
So I do believe Zuck has changed, but not in the direction that we should applaud and/or forgive him. I've only seen him change in the way that should make us more concerned and further justify the hatred. A man may change, but he does not always change for the better.
Not sure if you read the headline on that site, but it says "bad idea."
I never said OpenClaw was a bad idea.
I said the way most people are using it now isn't practical and/or saving them any time, and if there were ways, I would love to hear about them.
This is part of why the whole discussion has been so low value: people always default to "yep you're going to be proven wrong one day" or "you'll just be left behind then" instead of showcasing an actual, real life, practical example of using it to be more productive.
If you think it's fun and enjoyable, then have at it. I'm just not the biggest fan of people wasting a bunch of time on novelty and then telling me I'm dumb for not doing the same.
I'm switching over to Claude from OpenAI, and I don't care. OpenAI's image generation is terrible anyway. Just try to get it to generate something to scale, like a cabinet for a specific kitchen or bathroom space. Give it all the explicit constraints, initial sketches, etc. it wants.
The results are laughably bad.
Sure, it does get some of the tones and features right, but anything involving an actual real-world constraint is far off, and the dimension indicators it includes would be hilarious if they weren't so bad.
That will cover the physical infrastructure of your Internet provider. But there are a lot of websites and software on the internet that require either ads or payment to survive. Free usually means "surviving on somebody else's money", a.k.a. investors'.