Hacker News | mountainriver's comments

Also totally depends on what kind of coding you are doing.

Yeah, building a web app can become somewhat easy, but is distributed machine learning easy? What about low-level kernel code?


Still easy. You just have to learn different concepts. Someone in web needs to be familiar with databases, protocols like HTTP and DNS, some basic Unix admin, etc. Someone in low-level kernel code will need to know about hardware architecture, operating system concepts, assembly, etc. But the coding skill stays mostly the same: you're still typing code, organizing files, and fiddling with build systems.

Different types of "coding skills" and different types of complexity make it impossible to put these two into the same bucket of "still easy". You've probably never done the latter, so you're under the impression that it is easy. I assure you it is not. Grasping the concepts of a new framework versus making algorithmic, state-of-the-art improvements are two totally different and incomparable things. Among the roughly 30M software engineers around the globe, only a handful are doing the latter, and there's a reason for it: it's much, much more complicated.

You are conflating problem solving and the ability to write code. Web dev has its own challenges, especially at scale. There aren't a lot of people writing web servers, designing distributed protocols, or resolving sandboxing issues either.

I'm not conflating one with the other. I am saying that "coding skill," when dealing with difficult topics, is not just a "coding skill" anymore. It's part of the problem.

Not knowing C after a course on operating systems will block you from working on FreeBSD. Knowing C without a grasp on operating systems will prevent you from understanding the problem the code is solving.

Both are needed to do practical work, but they are orthogonal.


Exactly, but they are not as orthogonal as you try to make them. That's trivializing things too much and ignoring all the nuance. You sound like my uncle, who spent a career in IT but never really touched programming, yet nevertheless holds a strong opinion on how easy and trivial programming really is, and how it was never very interesting to him because it's work done by some other unimportant folks. In reality, he just cannot admit that he was not committed enough, or shall I say likely not capable enough, to end up in that domain, and instead he ended up writing test specifications or whatnot. A classic example of the Dunning-Kruger effect.

There is a nuance in what you say. You say it is "still easy," but it is not. It is not enough to take a course on operating systems and learn C to start contributing to an operating system kernel in an impactful way. Apart from the other software "courses" you need to take, such as algorithms, advanced data structures, concurrency, lock-free algorithms, probably compilers, etc., the one that is really significant and not a purely software domain is understanding the hardware. And this is a big one.

You cannot write efficient algorithms if you don't know the intricacies of the hardware, and if you don't know how to get the best out of your compiler. This cannot be taught out of context as you suggest, so in reality all of these skills are intertwined and not quite orthogonal to each other.


I do agree with you that there's a skill tree for any practical work to be done. And nodes can be simple or hard. But even if there are dependencies between them, the nodes are clearly separated from each other and some are shared between some skill sets.

If you take the skill tree you need to be a kernel contributor, it does not take much to jump over to database systems development or writing GUIs. You may argue that the barrier to entry for web dev is lower, but that's because of all the foundational work that has been done to add guardrails. In kernel work, those are too expensive, so there's no hand-holding there. But in web dev, often enough, you'll have to go past the secure boundary of those guardrails, and the same skill nodes, like advanced data structures and concurrency, will be helpful there.

Kernel dev is not some mythical land full of dragons. A lot of the required knowledge can be learned while working in another domain (or if you're curious enough).


No, it's not mythical, but it is vastly more difficult and more complex than the majority of other software engineering roles. That the entry barrier is lower elsewhere is not something I would dispute at all; it's common sense, unless you're completely delusional. And while there are a lot of skills you can translate from the systems programming domain elsewhere, there are not a lot of skills you can translate vice versa.

Macs are still pretty terrible at running LLMs. They will get there someday, but that day isn't today.

Honestly it’s been a breath of fresh air to have most of the gatekeeping in software be removed.

Seems that it was by and large just people wanting to feel important, and holding onto their positions.

Apps need great security, but security can also get out of control. Apps need good abstractions and code hygiene but that too can get out of control.

I’ve fallen in love with programming all over again now that I’m not so tied down by perceived perfection.


Everything is easy if you don't care about getting pwned, and you don't consider yourself responsible if this has negative effects on other people.

Is this satire?

Your model can absolutely improve

How would that work out barring a complete retraining or human in the loop evals?

Their whole game is just pump and dump

I got this issue too, but still found that a containerized desktop was superior due to resource efficiency

You must not know the stories of why GCP came to be.

It was an idea from the creators of Kubernetes and the execs at Google fought it the whole way


No, this is bunk. App Engine and GCE, the earliest components of GCP, predate Kubernetes.

[I've been there for nearly all the relevant time]


I hadn't heard that, that's interesting. Any sources you'd recommend to hear more about it?

I think it's a slightly different point, though. What I'm saying isn't about where the idea came from, or whether it was part of some prescient top-down bet/strategy from the very beginning.

It's more about where the strategy evolved to (and why), and whether they messed it up. GCP and Android are good examples where, at a minimum, it became obvious over time that these were massively important, if not existential, projects, and Google executed incredibly well.

My point is just that there's therefore good reason to expect the same of LLMs. After all, the origin story of the strategy there has a similar twist: famously, Google was significantly involved in early LLM/transformer research, didn't do much with the tech, faltered as it started to integrate it, course-corrected, and as of now has ended up in a very strong position.


Interviewed at Sully, absolutely insane company, I’m sure they aren’t making a rational decision here


You have me curious, mountainriver. While I don't understand what you've written, I want to know more. Are you saying that open-source models can't be trusted as well as (some of the?) proprietary ones, and therefore aren't fit for "mission-critical" medical applications?


Same, also the features are poorly implemented.

This is due to the fact that they don’t well.

