> "(2) Don't use any Microsoft-owned UI toolkit, you'll get burnt"
This is 100% true for all of the tech they've produced within the past ~20 years, but WPF and WinForms are extremely stable with no real issues.
It's so weird too, because almost everything they've done in the past 20 years has basically been an incomplete remix of WPF. If they had just stuck with WPF and extended it onward, something like a UI-toolkit equivalent of C#, it would 100% be the gold standard for Windows development today, and perhaps even for UI development in general if they open sourced/standardized it.
When they announced UWP I was just starting a new side project and thought, let's check it out. I was hoping it would basically bring WPF into first-class-citizen territory. Instead, they made the two needlessly incompatible. Like writing for both NeXT and classic Mac OS, but on the same platform. I got discouraged right away and have really never done any significant Windows work since, which turned out to be a great move for my sanity.
Ahhm. At a previous $DAYJOB, I inherited a WPF app written in 2012; I stumbled upon several WONTFIX bugs over the years, mostly having to do with shared-memory bitmaps, having to manually call the GC at times, and a host of other things.
Stable, but with many issues. Stay away if you value your sanity and do anything nontrivial.
WPF and WinForms may be stable, but they're not going to work well on a modern machine with a HiDPI monitor. Same with Win32 controls.
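For what it's worth, a Win32 process can at least opt itself into per-monitor DPI awareness on Windows 10 1703+, which avoids the blurry bitmap-stretching. A minimal sketch via ctypes (assuming Python on Windows; this is just the opt-in, not a fix for toolkits that don't scale their own layout):

```python
import ctypes

user32 = ctypes.WinDLL("user32", use_last_error=True)

# Pseudo-handle for DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2 (windef.h).
# DPI awareness contexts are pointer-sized handles, hence c_void_p.
PER_MONITOR_AWARE_V2 = ctypes.c_void_p(-4)

user32.SetProcessDpiAwarenessContext.argtypes = [ctypes.c_void_p]
user32.SetProcessDpiAwarenessContext.restype = ctypes.c_bool

# Must be called before the first window is created; otherwise the process
# keeps its default awareness and DWM bitmap-stretches it on HiDPI monitors.
if not user32.SetProcessDpiAwarenessContext(PER_MONITOR_AWARE_V2):
    raise ctypes.WinError(ctypes.get_last_error())
```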
My take: Use Win32 for opening windows and interfacing with the OS. Use a different toolkit for actually drawing inside the windows.
Ideally a toolkit that can paint in sync with window open and resize; otherwise you'll get Electron-style window flickering. And something that supports multiple windows in a lightweight way, since you're going to want popups, menus, etc. to extend outside the parent window's bounds.
They should have just updated WPF with their newer widgets, continued to improve it, and even made it cross-platform (basically what Avalonia is doing).
Yeah, this is the thing I think many don't want to see. Imagine a bunch of farm laborers being trained to use a tractor/reaper early in its development. Certainly they'd think it's cool and convenient, because it is. But if it works out, then most of those farm laborers are now obsolete, and a handful of them can replace the rest. And indeed this is why agricultural employment went from the majority of jobs to a footnote.
The irony is that if LLMs live up to their potential, then the value of software development as a skill is going to plummet, at least as something you do for others. I say it's ironic because obviously the people most interested in using LLMs for software development are software developers, and most are not working independently. It'd be like all of us proactively training our own replacements.
I was highly skeptical of this happening not that long ago, but I have to say it seems increasingly likely. LLMs are still quite mediocre at esoteric stuff, but most software development work isn't esoteric. There's a viable argument that software development largely isn't about writing code, but the ability to write code is what justifies software developer salaries, because there's a large barrier to entry there that most just can't overcome. The 80/20 law seems to apply to everything, and certainly here: 80% of your salary is justified by 20% of what you spend your time doing.
It's quite impossible to imagine what this will do to the overall market, because while this sounds highly negative for software developers, we're also talking about a future where going independent will be easier than ever before, because one of the main barriers to fully independent development is gaps in your skillset. Those gaps may not be especially difficult, but they're just outside your domain. And LLMs do a terrific job of passably filling them in.
It'd be interesting if the entire domain of internet and software tech were to plummet in overall value due to excessive and trivialized competition. That'd probably be a highly disruptive but ultimately positive direction for society.
You're probably being downvoted because you're assuming everybody knows what you mean. For somebody who doesn't, it's going to read as a nonsensical statement. The issue is that the government mandated that banks enable easier access to loans for housing, as a means of resolving prior housing issues.
But what happens when you suddenly give a bunch of people the ability to go arbitrarily far into debt to buy something they perceive as priceless (because housing, like most any other 'thing', tends to endlessly appreciate in value in an inflationary economic system)? Obviously prices skyrocket. The exact same thing happened with education, for the exact same reason.
I think a practical argument against what you're saying here is simply that solving the Mad Max stuff doesn't require anything at all like this. The type of crime that's both scary and impactful (terrorism, for example, is scary, but so extremely rare that it can't really be considered impactful) is generally trivial to bust.
Outdoor lighting in particular, at first at least. A fun historical anecdote on this one that most don't know: the saying that somebody "can't hold a candle to [x]" is a reference to the old profession of link-boys, mostly poor kids who'd carry a torch at night to light people's way, in exchange for a penny or two.
Imagine democracy playing out in literally any measurable field. Think about society getting to vote on who should be on a basketball team, without any real knowledge of the candidates' abilities beyond what they said and advertised about themselves, and then putting the winners of the vote on a team. They'd get face-stomped by a D-tier NBA team pretty much every time.
Democracy isn't about maximizing outcomes, because maximizing outcomes entails the possibility of minimizing outcomes. Marcus Aurelius was perhaps one of the best rulers in all of history. His son Commodus, raised by him from birth, was certainly one of the worst. Minority-rule systems oscillate between the extremes of the best of times and the worst of times. Democracy is always just kind of meh: never particularly great, never particularly awful.
But it creates a stable system, because while it's meh in the present, you can always envision that things will be totally different in 4 years. Of course they won't be, but there's this weird bug in our psychology where we can't help but remain optimistic, even though in reality, candidate after candidate, it always feels like 'well, at least it can't get any worse than this' and then the next guy is like 'hold my beer.'
I think that should be fairly obvious: money + ease of getting here. America is, relative to the world, perceived as quite wealthy. South America is full of places that are quite poor. Put the two side by side, and many of the guys coming here speaking not a lick of English, with no skills to boot, probably envision themselves coming home rich.
It's even relatively easy to put yourself in their shoes. Colombia's GDP per capita is about $8k. In the US it's about $80k. Imagine how you'd feel if Canada had a GDP per capita of $800k. To many people it'd seem like a great idea to move there, completely regardless of everything else about the country. People warning you that you'll end up mowing yards and painting houses while making barely enough to put a roof over your head? Bah! Nonsense! How can that be true on $800k/year!? Canada, here I come!
You can see this play out the same way in places like Saudi Arabia. Not many places have a taste for its policies, religion, or much of anything else, yet it has a massive immigrant population, far higher than the US's (as a percentage), precisely because it pays stupidly high wages, often tax free, and has a low cost of living. You can become a dollar millionaire teaching English there if money is what you're after, simply because you can easily save thousands of dollars a month. And if you get bored you can go watch somebody get crucified for witchcraft on a weekend now and again.
I think it's very safe to assume that no major US-based platform has 'real' E2E encryption. They're almost certainly all a part of PRISM by now, and real E2E would contradict their obligation to enable government surveillance. So the only difference is not lying about it. Though I expect the other platforms are, as when they denied being part of PRISM, telling half-truths and just being intentionally misleading: 'We provide complete E2E encryption [using deterministically generated keys which can be recreated on demand].'
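To make that bracketed aside concrete: if keys are derived deterministically from material the provider also holds, 'E2E' becomes key escrow by construction. A toy sketch, entirely hypothetical (made-up scheme and names, not any real platform's protocol):

```python
import hashlib
import hmac

def derive_message_key(master_seed: bytes, user_id: str) -> bytes:
    """HKDF-style extract-then-expand; fully determined by its inputs."""
    prk = hmac.new(master_seed, user_id.encode(), hashlib.sha256).digest()
    return hmac.new(prk, b"message-key\x01", hashlib.sha256).digest()

# The client derives its "end-to-end" key locally...
client_key = derive_message_key(b"escrowed-master-seed", "alice@example.com")

# ...but anyone holding master_seed (the provider, or whoever can compel
# the provider) recreates the exact same key on demand.
recreated = derive_message_key(b"escrowed-master-seed", "alice@example.com")
assert recreated == client_key  # ciphertext readable without 'breaking' E2E
```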
Rather than an affinity for artisanal stuff or some bias against AI itself, I think it's simply that most of what gets made with AI is going to be very derivative. Even before AI you'd read posts from people, including on here, like 'I made a highly competent knockoff of [popular indie game] but got no sales. Woe is me.' But games aren't commodities. If people like a game, that doesn't mean they want to play, let alone buy, a complete knockoff of it.
The biggest barrier to success has always been having a good idea, and AI is just going to make that ever more apparent, because you'll be able to cook up knockoffs ever more rapidly.