Hacker News | hobofan's comments

Yes they did, but the social bump that was there shortly after release has significantly calmed down already.

It did rekindle my love for the game, but most outposts are empty, even in the international districts, so I think it's hard for new joiners to get hooked on it.


> Tech companies are in the business of nurturing teams knowledgeable in things

It pains the anti-capitalist fibers in my body to say this, but no, they are not. At most, the value lies in organizational knowledge and existing assets (= source code, documentation), so that people with the least possible knowledge can make changes. In software companies in general, technical excellence and knowledge are not strongly correlated with economic success, as long as you clear a certain bar (which isn't that high). In hardware/engineering companies, by comparison, they are a lot more correlated.

In the concrete example of a legacy codebase we have here, there is even less value in trying to build up knowledge in the company, as it has already been decided that the system is to be discarded anyway.


> you would learn how things work and then write the code

In a legacy codebase this may require learning a lot about how things work just to make small changes, which can be much less efficient.


I might still be naive about the industry, but if you don't know how the legacy codebase works, you might either delegate the change to someone else in the company who does, or, if there is no one left, use this opportunity to become the person who knows at least something about it.

In Azure Foundry, they list GPT 5.2's retirement as "No earlier than 2027-05-12" (it might leave OpenAI's normal API earlier than that). I'm pretty certain that Gemini 3, which isn't even in GA yet, will be retired earlier than that.

That's true in theory, but not in practice. In practice, every inference provider handles errors (guardrails, rate limits) somewhat differently and with different quirks, some of which only surface in production usage, and Google is one of the worst offenders in that regard.
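To make the point concrete, here is a minimal sketch of the kind of error-normalization layer you end up writing when juggling multiple inference providers. The payload shapes below are illustrative assumptions, not exact API contracts; the point is that the same logical failure (a safety block, a rate limit) arrives in structurally different forms per provider, sometimes even with an HTTP 200 status.

```python
# Unified error categories for a multi-provider inference client.
RATE_LIMIT = "rate_limit"
GUARDRAIL = "guardrail"
OTHER = "other"


def normalize_error(provider: str, status: int, payload: dict) -> str:
    """Map a provider-specific error response onto one internal taxonomy.

    The payload fields checked here are hypothetical stand-ins for each
    provider's actual response format.
    """
    # Rate limits at least tend to agree on the HTTP status code.
    if status == 429:
        return RATE_LIMIT
    # One provider might signal a content filter via an error code...
    if provider == "openai" and payload.get("error", {}).get("code") == "content_filter":
        return GUARDRAIL
    # ...while another might return a "successful" response carrying a
    # block reason instead of an error status.
    if provider == "google" and payload.get("promptFeedback", {}).get("blockReason"):
        return GUARDRAIL
    return OTHER
```

A wrapper like this only covers the quirks you have already seen; the production-only surprises the comment mentions are exactly the cases that force you to keep extending it.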

How does this compare to solutions like e.g. Clara[0] that have been around for a decade?

A lot of similar solutions came up in the early chatbot era, when Facebook published Duckling and it became trivial to parse dates from natural language. I also looked into building such a product at the time, but ultimately found it hard to find an entry into the market: most people who actually need something like this already have secretaries (who will also schedule a lot of other things regarding the meeting), and most other people, who have a less severe form of the problem, rarely want to actually pay for such a product.

[0]: https://claralabs.com


Great question! We have a lot of friends in the B2C space. What Vela is designed for is the subset of scheduling where nothing in the market works, specifically for businesses. Think of a staffing firm coordinating across candidates, clients, recruiters, and client development to schedule interviews/meetings. Or another one doing 1,000+ interviews a week, wrangling across phone, SMS, and email. These are scenarios where companies tried every tool out there and eventually just did it themselves, because the tools couldn't meet their customers where they are and didn't handle the workflows/behaviors of their industries.

> but you'd be fighting against it, rather than being helped by it

I think this was also true of doing game development in Flash. Some people here might be looking back at Flash with rose-tinted nostalgia glasses.


I think Rive[0] is quite competitive with what was possible back then in covering the full authoring stack.

[0]: https://rive.app


You can do fine nowadays with macOS or Linux in most college degrees I've seen, since there are decent open source alternatives for most software that's popular enough to be used in teaching.

However, by default almost every college curriculum I've seen (unless it's in CS or an IT-combined field like bioinformatics) is still taught Windows-first, be it sociology, biochemistry or economics. In many you also have a strong presence of the MS Office suite, which is probably the first software any university will buy license packs for their students.


To me the price seems so uncharacteristically low for Apple, at a time when hardware prices are rising across the board, that this almost feels like an attempt to capture the desktop market. While Microsoft is fumbling with Windows on every front, having a competitively priced MacBook even for budget-conscious people seems like a smart move that will pay off even without direct high margins.

Capture the student market, 100%. I'd buy one for my kids tomorrow. These machines are made with an iPhone chip, so they're going to be great at browsing the web and studying. I wouldn't buy one for myself to do actual work on, but for light users it's the perfect device. Start them early and get them hooked into the ecosystem, so they grow up and keep buying iPhones, Apple Watches, AirPods, and iPads.

You have to compare it with the base iPad, which costs only about half of the Neo. The Neo adds a keyboard (but without Touch ID for the base model), a larger screen but without touch, a somewhat better but also binned SoC (which the next iPad refresh will very likely also get), and more storage. That seems roughly in line with the price difference.

Interestingly it's cheaper than the iPad with the $250 magic keyboard.

It's more expensive if you want Touch ID, and on par ($350 + $250 = $600) if you don't. However, the $250 Magic Keyboard is heavily overpriced; the actual keyboard can't cost more than $10-20 to make.
