> When Claude writes tests for code Claude just wrote, it's checking its own work.
You can have Gemini write the tests and Claude write the code. And have Gemini do review of Claude's implementation as well. I routinely have ChatGPT, Claude and Gemini review each other's code. And having AI write unit tests has not been a problem in my experience.
yeah i have started using codex to do my code reviews and it helps to have “a different llm” - i think one of my challenges has been that unit tests are good but not always comprehensive. you still need functional tests to verify the spec itself.
The problem with red-light cameras is that enforcement becomes robotic. Robots are perfect—they don’t make mistakes (at least in theory), and they don’t show leniency. If policing is done by robots, then humans are expected to be infallible.
This is a complete non-issue. It's a traffic light, you are supposed to stop when it turns yellow! The yellow is the leniency. If you can't manage to stop before it turns red, you are either: 1) speeding, 2) driving a vehicle with defective brakes, or 3) mentally impaired. In all three cases you are a danger to fellow road users.
Besides, it's not a "the machine says so and not even the Supreme Court can overturn it" scenario. If there's genuinely a reason to cross into the intersection while the lights are red (such as there having been an accident, and a cop is temporarily managing traffic) the ticket will be waived. Heck, there will probably even be photographic evidence of it!
Most countries even have cops judge the tickets, just to already filter out those weird cases. The registration is done by a robot, but the policing is still done by a human.
Or you have a heavy, unbalanced object in your car you don't want sliding, something fragile in tow you don't want to subject to hard deceleration, or you just don't have superhuman reaction time, since some lights have extremely short yellows.
Or, a deer jumped out on the side and you briefly looked away at it.
Or you could tell the driver behind you wasn't slowing down, so the safer option is to go.
Or. Or. Or. Real life is messy, and there's a million reasons to go through a yellow instead of slowing down.
> Most countries even have cops judge the tickets, just to already filter out those weird cases. The registration is done by a robot, but the policing is still done by a human.
This is common in the US as well. The machine takes the picture, filters out the illegible ones, and sends the rest to an actual officer who will issue the ticket.
> and they don’t show leniency. If policing is done by robots, then humans are expected to be infallible.
This is bad when applied to laws that were written with an exception of leniency and selectivity in enforcement, which is quite a lot of them. For running red lights though? I don't mind if the robots take you off the road automatically.
Running red lights? That's not all the cameras are used for. If you are making a right turn on red and didn't come to a complete stop first, you can get a ticket.
Okay? Rolling through a red light is dangerous whether you do it straight or to the right. Hell, the latter probably kills more pedestrians. I don't really mind holding drivers to high standards.
But why would you do that? Especially if you know there are robots enforcing that you come to a complete stop?
There are many places that don't even allow rights (or lefts) on red.
I got a right on red ticket once, and then I made it a point to obey the law -- especially at the intersections with the robots.
For things like traffic laws especially (where there are very simple cut and dry rules), why is it okay to break the law, and why is it not okay for robots to enforce the law?
Subjectivity in applying the law is a huge problem, especially given how corrupt and violent police are. Red light cameras remove police from the equation for that infraction and apply the law evenly. They also scale in a way that police just can't, and that's extremely important for safety.
I live in a city where red light running is an epidemic. Drivers flagrantly just don't stop, and it kills people all the time. Red light cameras - plus actually revoking drivers licenses, and then actually throwing people in jail for driving on suspended licenses - are the only way to fix this.
It's far past time that drivers are no longer immune to consequences for violent, sociopathic behavior.
Well, in 2019 an estimated 840 people died in the U.S.A. from red-light running (<https://ncsrsafety.org/stop-on-red/red-light-running-fatalit...>). That's about 2.3 people a day, so the last person killed by someone running a red light died, statistically, about 10.5 hours ago, and the one before that about 21 hours ago.
> If policing is done by robots, then humans are expected to be infallible
The reality is that the people doing the policing are counting on humans not being infallible.
Fines have become an important revenue stream, that's why they are being automated.
Now that this is becoming more widespread, there's a perverse incentive for governments to maximize the difficulty in avoiding fines. Lower the speed limit on roads designed for higher speeds for "safety", etc
There are many citizens, like me, begging for red light cameras so something can be done about the rash of crashes and killings from willfully reckless drivers.
> Fines have become an important revenue stream, that's why they are being automated
Maybe we should legislate traffic fines out of existence, and just use points. Or at the very least the fines should never go back in any recognizable way to the budget of the police doing the enforcement.
The "content over chrome" trend was started by Microsoft's Metro design language. Windows 8 and Metro are one of the biggest UI/UX disasters since the dawn of computing. Why would Apple keep copying the worst ideas from Microsoft?
Metro worked perfectly well on tablets. And every version of Windows since 8 has actually kept some of Metro (in the form of, e.g., larger touch targets), because having a single version of the Windows UI for both touchscreen and mouse-and-keyboard computers is what enabled the creation of the "2-in-1" or "convertible" touchscreen notebook, a design that basically every modern Windows notebook instantiates.
Liquid Glass also makes more sense on tablets. I think Apple is copying Microsoft because Apple is also moving toward full UI-level unification between their desktop mouse-and-keyboard UI and their mobile/tablet touchscreen UI. They've already done it for some apps (e.g. Notes.)
MacBook Neo is getting a lot of attention for good reason. It is a great laptop. The fact that it isn't a "convertible touchscreen" notebook doesn't seem to bother anyone.
Apple copying Microsoft is a mistake. It used to be the other way around.
The Windows Server edition contemporaneous with Windows 8 also got the Metro UI upgrade. I don't know, I guess MS figured IT wanted to provision Windows services from a Surface tablet?
I actually really did like Windows Phones though. I can imagine a world with a third competitor in that space today... But MS didn't seem to have any understanding or ability to develop an ecosystem that works. Even when they were literally paying people to write apps for their app store, it was just terrible.
It worked so incredibly well on Windows Phone 7 but translated horribly to the Windows 8 desktop: especially the weird mouse gesture to get to the neutered Settings panel, the redundancy of that panel to begin with, and the entire UWP app experience. Windows 10 was a great marriage of the two concepts; even if the Settings menu was still redundant, it was functional. Then along comes Windows 11, which even in its most recent feature updates feels like a half-finished UI.
That article was written in 2014, just a few years after the trend started, and still today, over a decade later, Apple, once famous for its UX, is still failing to follow it.
What puzzles me is that information like this is out there. How did Apple get it so wrong?
I am hopeful for the new UX VP. He has his work cut out for him.
> Why would Apple keep copying the worst ideas from Microsoft?
Remember also the "Get a Mac" ads that parodied Windows Vista's permission dialogs; now macOS is a permission-dialog hell.
Tim Cook was an IBMer. I'm sure that Cook was a fine hire as an operations manager, but I doubt that Steve Jobs intended for someone like Cook to be in charge of everything at Apple, including UI design. (Jobs never put Jony Ive in charge of software, by the way, whereas Cook did.) Indeed, I doubt that Jobs groomed anyone to be his successor. By the time Jobs learned he had a fatal illness, it was too late, and he had to turn over the company to someone the board of directors would accept, which was Cook. Jobs was CEO but didn't own the company; infamously, the Apple board of directors chose John Sculley over Jobs in an earlier power struggle.
You are rewriting history. Any time Jobs had to step aside from the CEO position, Cook took over immediately. He was Jobs' designated successor for a decade when he learned he was sick. They merely implemented the succession plan they already had.
When Cook took over, he was unequivocally the only choice. He steered the company in his own direction, with a focus on operational health to the detriment of other things. He kind of lost the plot somewhere in there and has been spinning his wheels for a while. That's not what I'm contesting. It's your idea that Jobs didn't want Cook. Jobs loved Cook.
> Any time Jobs had to step aside from the CEO position, Cook took over immediately.
Any time Jobs had to step aside from the CEO position temporarily, Cook took over immediately. Metaphorically speaking, Cook kept the trains running on time. Cook did not set or change the direction of the company at the time, and Jobs was still available for consultation.
Sick is not the same as dying. Jobs initially didn't think he was dying, and tried to treat his illness with some hippie-dippie "alternative" medicine, when aggressive treatment might have saved his life.
> He was Jobs' designated successor for a decade when he learned he was sick.
Citation needed.
> Jobs loved Cook.
In what way? According to biographer Walter Isaacson, Jobs lamented that Cook was "not a product person".
Speaking for myself : it's a bit creepy and unsettling. Using brain cells is probably inching closer to consciousness than today's silicon is, and consciousness isn't well understood so I'd fear this line of research could eventually lead to the "I have no mouth and I must scream" the other commenter referenced. Many decades from now we might be wondering how much of a human brain needs to be grown in a lab before it's considered unethical.
Is that an issue only because these neurons are biological (still artificial, since they're lab grown)? Silicon neurons could also become more powerful and lead to the "I have no mouth and I must scream" scenario. In fact, top tech companies are investing hundreds of billions of dollars a year to make their silicon neurons more powerful.
Poor CEO my abs. When ChatGPT came out, Microsoft was singing victory songs and predicting Google's imminent death. Three years later, Google has one of the best models and Microsoft is still borrowing OpenAI's. Not only that, Google runs its models on its own hardware, not Nvidia's.
One of the things that a CEO drives is vision and innovation.
Sundar misses the mark on these things. AI is a good example: Google invented the transformer architecture but simply published it for its competitors to use. It took a code red in 2023 to finally push Google to develop products based on it.
Cloud: years late to the game. All it would have taken was a letter similar to the famous Bezos memo to eventually get all of Google's world-class, scaled infra pointing externally and generating revenue. Instead, Google Cloud started late and couldn't reuse much of the internal infrastructure.
Stadia is another example. That architecture is probably the future; it's not clear how gamers in developing countries are going to afford thousands of dollars in hardware that sits idle 90% of the time.
> Google invented the transformer architecture, but simply published it for its competitors to use.
That's how innovation works in this industry. If companies didn't allow researchers to publish their work it would set us back decades. Researchers building on each other's work is how this industry was built.
> It took a code red in 2023 to finally push Google to develop products based on this.
So Google executed. Ability to execute is one of the things that makes a good CEO. Other CEOs have additional qualities such as vision, and getting others to believe in the vision. But not every CEO needs to be a Steve Jobs!
Plenty of innovations are coming out of Google, just look at Nano Banana Pro for example.
Google invented the basis of LLMs, but under Pichai failed to come up with the idea of ChatGPT. Getting Gemini into a workable state required the return of Page and Brin. It seems to be working out for Google, but how they got here is a very big mark against Pichai's leadership.
1. Proprietary Data (Youtube, docs, gmail, cloud logs, waymo, website analytics, ads, search, the list is huge)
2. Commercial Datacenters (theyre ahead at least)
3. Chip production (Google is manufacturing proprietary chips)
4. Consumer OS (Chrome, Android)
5. Consumer Hardware (Pixel)
Basically, Google has access to data that OpenAI will never have access to, can lower costs below what OpenAI can, and is already a leader in all the places where OpenAI will need massive capex to catch up.
You can't train LLMs on proprietary data, at least not if you want to make that LLM as accessible as Gemini. Otherwise random people can ask it your home address.
So it matters less than one would think. Also, ChatGPT can already do internet search as a tool, so it effectively has access to, say, Google Maps' POI database of SMBs.
And ChatGPT also gets a lot of proprietary data of its own as well. People use it as a Google replacement.
>You can't train LLMs on proprietary data, at least not if you want to make that LLM as accessible as Gemini. Otherwise random people can ask it your home address.
If this is your only criterion, I think you misunderstand what proprietary data is and the ways companies can mitigate the situation at the inference stage.
What if the CEO isn't just telling the company how much to invest, but also influences how that money is used? Google's relative success (if it exists; I'd rather not judge) isn't from investing more than everybody else, because the money just keeps pouring into these things for all contenders.
This is a major challenge to Microsoft. A 13-inch Surface Laptop costs $899 [1]; that's 50% more than an equivalent MacBook! And even at that higher price the Surface Laptop doesn't have a good screen: it uses 150% scaling (as opposed to the ideal 200%), which means subtle display artifacts.
Other than Microsoft, nobody even makes decent laptops in the Windows world. I am typing this on a Lenovo Yoga; it has a decent screen and keyboard, but the touchpad is horrible. Samsung makes good laptops, but my keyboard gave out after just two years. Most other laptop makers have horrible industrial design. The Dell XPS 17 was pretty good, but now it has a weird keyboard.
The best laptop is now significantly cheaper than the horrible ones. Incredible achievement by Apple, and a major challenge to Windows laptop makers.
I was recently on the lookout for a new laptop. I wanted something BEEFY spec-wise, but 13 inches at most.
I literally couldn't find anything on the PC side. I wanted x86 because I prefer Linux Mint as my OS (didn't care about Windows), but it was impossible to find a good laptop with a good GPU, more than 64GB of RAM, and decent build materials (I've got a ThinkPad and the plastic build is just terrible; the screen bends when you pull on it to open the laptop).
So I settled for a 128GB M4 Max MacBook Pro. It has been pretty solid so far. I'm a power user, so the RAM gets used quite a lot (one of the reasons I wanted x86/Linux was to avoid virtualization overhead in docker/podman).
Macs are way more expensive than other laptops, but their level of tech sophistication is miles ahead of everyone else's.
I am a longtime Windows user and it brings me absolutely no joy to report that the M4 I am forced to use for work runs the Rust compiler a good bit faster than the big fancy gaming PC I just got with a 9800X3D.
Rust literally compiles ~4x faster on WSL than on the Windows command line on the same hardware, so try that and see. Also set up the mold or wild linker, as well as sccache (sccache is OS-agnostic, so you can use it on macOS too). Make sure your code is on the WSL side, though, not on /mnt/c, which is the Windows side; that will kill compilation speed.
That has not been my experience at all; I get pretty much the same times on the same machine on Linux and Windows. Something weird is happening on that person's machine. Someone mentioned Defender, and that could certainly be it, as I have it totally disabled (forcibly, by deleting the executable).
You shouldn't have to go through all these extra steps just to squeeze out the same performance you would get by just installing [Other OS].
At some point I realized I was spending hours at a time trying to 'fix' Windows and decided to give Mac a try right around the time that Apple Silicon came out. It was a night and day difference.
CrossOver lets me play most single player games just fine on my M4 Pro, and personally I found multiplayer gaming taking too much of my time and emotional energy anyway.
WSL is fantastic, apart from the fact that you need to compact its virtual disk intermittently. I use it for work and it's great until you hit something incredibly frustrating, like needing pass-through for your hardware.
The one thing I can say about my MacBook, as someone who switched after a decade of Windows, is that stuff tends to just work, minus window switching.
I'd wager that's more likely due to Windows than the hardware. Sure, the hardware plays a part, but it's not the whole story, or even most of it.
My C++ projects have a Python-heavy build system attached, and the main script that prepares everything and kicks off the build takes significantly longer to run on Windows than on Linux on the same hardware.
AFAIK a lot of it is NTFS. It's just so slow with lots of small files. Compare unzipping a moderately large source repo on Windows vs. POSIX; it's night and day.
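If you want to see the per-file overhead yourself, a throwaway sketch like this works; the file count and size are arbitrary choices of mine. Run it once on NTFS and once on a POSIX filesystem (or inside WSL's ext4) and compare the times:

```python
import os
import tempfile
import time

def time_small_files(n=2000, size=256):
    """Create n small files of `size` bytes each; return elapsed seconds."""
    payload = b"x" * size
    with tempfile.TemporaryDirectory() as d:
        start = time.perf_counter()
        for i in range(n):
            with open(os.path.join(d, f"f{i}.txt"), "wb") as f:
                f.write(payload)
        return time.perf_counter() - start

if __name__ == "__main__":
    print(f"{time_small_files():.3f}s to create 2000 small files")
```

The gap mostly reflects per-file metadata and security-check overhead, not raw disk speed, which is why it shows up so strongly in unzips and builds.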
A big part of it is that NT has to check with the security manager service every time it does a file operation.
The original WSL, for instance, was a very NT answer to the problem of Linux compatibility: NT already had a personality that looked like Windows 95, so just make one that looks like Linux. It worked great, with the exception of the slow file operations, which I think was seen as a crisis in Redmond because many software developers couldn't or wouldn't use WSL due to slow file operations affecting many build systems. So we got the rather ugly WSL2, which uses a real Linux filesystem so files perform like files on Linux.
I don't know about ugly. Virtualization seems like a more elegant solution to the problem, as I see it. Though it also makes WSL pointless; I don't get why people use it instead of just using Hyper-V.
Honestly, just cause it's easier if you've never done any kind of container or virtual os stuff before. It comes out of the box with windows, it's like a 3 click install and it usually "just works". Most people just want to run Linux things and don't care too much about the rest of the process
Try adding your working directory to the exclusions for Windows Defender, or create a Dev Drive in Settings (it will create a separate partition or a VHD using ReFS and exclude it from Defender). Should give it a bit of a boost.
Apple buries this info, but the memory bandwidth on the M series is very high, and doubly or triply so for the Pro & Max variants, which are insanely high.
Not much in the PC lineup comes close, and certainly not at the same price point. There's some correlation here with PCs still wanting user-upgradable memory, which can't run at those higher bandwidths, versus Apple integrating it into the CPU package.
They don't bury it. It's literally on the spec page these days. And LPCAMM2 falls somewhere between the base M and Pro CPUs while still being replaceable.
The new MacBook Neo has less than half the memory bandwidth of the base-model MacBook Air.
This shouldn't be surprising. macOS has a faster filesystem+VFS than Windows, and the single thread perf of the M4 beats most PC cpus. I'm not sure what linker rust uses, but the apple native ld64/ldPrime is also pretty fast as far as linkers go.
Windows is also slow enough at forking that clang has an "in-process CC1" mode because of it.
This is how opinions differ. IMO plastic is better than aluminium: it's robust (if done right), lighter, and doesn't have high thermal conductivity (which makes lap usage possible; MacBooks can be uncomfortable on the lap if they get too hot).
I have an Air. Maybe active cooling prevents it from getting too hot. With the Air, the metal body is kind of the heatsink.
I can configure my Snapdragon plastic laptop such that the fan doesn't turn on, so the body being metal isn't a requirement for not turning on the fan...
It's almost as if they weren't lying when they said dropping it in the phone was a waterproofing measure. I guess people aren't dropping their laptops in pools all the time.
I think the 14" and Air might get a little warmer, but I can't recall a time I've felt heat from my 16" M4 Pro; fan noise is rare. On my 13" Intel, it was comically easy to cook my balls and the fans were constantly at max.
> plastic is better than aluminium. It is robust (if done right), lighter and doesn't have good thermal conductivity (which makes laptop usage possible
> ive got a thinpad and the platic build is just terrible. The screen bends when pulling it to open the laptop
Damn. I was at IBM in the early 2000s, and for many decades you used to be able to beat people to death with IBM hardware, including ThinkPad laptops and Model M keyboards.
They built a reputation on that and silently replaced the plastic with crap ABS. ThinkPads have been garbage since 2012: not spec-wise, but build-quality-wise. Spec-wise it's always been a beefy machine.
> I wanted something BEEFFY! Specs wise but 13 inch at most.
One thing to bear in mind is bezels are a lot thinner than they were a few years ago.
~7 years ago, my daily driver was a Latitude E7270 - a 12.5 inch ultrabook with dimensions of 215.15 mm x 310.5 mm x 18.30 mm, 1.24 kg, 14.8 inch body diagonal
Today, an XPS 14 has dimensions 209.71 mm x 309.52 mm x 15.20mm, 1.36 kg, 14.7 body diagonal - and a 14-inch screen.
The 12.5 inch segment hasn't disappeared - it's just turned into the 14-inch segment.
The same is also true within the Macbook line. The 14" Pro is smaller and nearly 2lbs lighter than the first 13" unibodies. I have my 2009 college laptop on a shelf as a memento and it feels pretty chunky. This hasn't changed much in the M-series though, and the M5 is slightly heavier than the M1.
Something I miss from the Windows side is sub-kg machines, at least since Apple discontinued the 12" Macbook. It makes a surprisingly big difference when traveling, especially with Asian carriers that have hard carry-on limits. The Thinkpad X1 Carbon is a fantastic form factor, though the older Intel chips run incredibly hot. I repurposed that as a garage/workshop Linux machine. Unfortunately, the price differences between Mac/Windows also disappear when you start looking at those higher-end machines.
My Sony Vaio Z from 2009 or 2010 looks at your Dell in contempt: 13.1" FullHD screen at 314mm x 210mm (we'll pretend the thickness does not matter ;)) and 1.36kg. Vaio TT was even smaller footprint.
But even in 2018, you could get an X1 Carbon at 1.13kg and 323mm x 217mm x 15.5mm.
One thing Apple seems to do very well compared to other vendors is make all their hardware available in all markets on release. Companies like Dell, Asus, Lenovo, they have a confusingly large array of models, and they never release the best ones worldwide, or it takes so long to get to New Zealand that I already gave up and bought an Apple computer instead.
I, too, am a dinosaur, but touchscreens on removable screens/tablets are the way to go!
My friend, just imagine: Slide screen out of laptop, it's a standalone tablet. Connect some wires to it and you have an oscilloscope. Do some diag. Connect USB buses to it, and read some codes. Carry it around in your garage and take photos of your stuff, the images get recognized by AI and you've updated your garage inventory, it's uploaded to your Homebox running on a mac mini in a shelf somewhere. It has a built in cellular and you can be out in a park taking a picture of a baby owl, mark it with GPS, upload.
When you are done roaming the world loading in data and snapping pics, sit back down, connect the tablet to a keyboard, or even a Thunderbolt cable for your external display and peripherals, and write up some code or a report. Then in the evening, go play some games, all on the same computer.
You might want to actually click the links and spend a couple of minutes before typing comments. This is not a laptop with a touch screen; it's a tablet with a kickstand and a detachable keyboard.
That's just a broken, compromised Windows laptop: a true "master of nothing" device. Windows is a miserable tablet OS, and the kickstand makes the tablet a pain to use in desktop mode.
I accidentally got a pair of ThinkPads that happened to have touch screens, and I absolutely love the touch screen, often it's easier than the touchpad or keyboard nub.
I'm not the person you're replying to, but I do have a 64GB machine that I'd been planning to bump up to 128 right around the time the prices went through the roof. My uses are:
- VMs, I'm leaning on them more and more for sandboxing stuff I'm working on, both because of the rise in software supply chain threats, and to put guardrails around AI agents.
- Local LLMs experimentation, even pretty big MoE models (GPT OSS 120b) run pretty usably (~10 tokens/sec) with the latest tooling on a 16GB GPU and a lot of system memory.
- Even compared to a fast NVMe drive, it's super nice to load a big dataset into memory and process it right there, rather than working off the disk.
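The "120B MoE on a 16GB GPU plus system RAM" point above checks out on the back of an envelope. This is a hypothetical helper of mine, assuming roughly 4.5 bits/weight for a Q4-style quantization with overhead; the numbers are illustrative, not measured:

```python
def model_memory_gb(params_billion, bits_per_weight=4.5):
    """Rough weight footprint of a quantized model in GB (decimal)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 120B-parameter model needs roughly 68 GB for weights alone,
# which is why most of it has to sit in system memory next to a
# 16 GB GPU, with only some layers/experts offloaded.
print(f"~{model_memory_gb(120):.0f} GB")  # ~68 GB
```

Since only a few experts are active per token in an MoE model, the compute per token stays modest even though the weights are huge, which is what makes ~10 tokens/sec plausible in this split setup.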
Yeah, I have a 64GB M1 Max and can run local models pretty well. I bought it on release and even now it never feels slow. I may upgrade just because I want to move to the 14” since I travel more now.
You mean all of that running all the time is 70gb?
I tried FreeCAD + Blender with an 8M-poly sculpt model + PrusaSlicer, but that was only 11GB, so I added PyCharm + Steam and Cyberpunk 2077, and that was 19GB.
The language server for many things I work on sits at 28GB per copy. I work for Twitch; our codebase for the website is not small. We're moving all engineers to a min spec of 48GB.
I'll do stuff all day prototyping data analysis approaches that will fill ram with a pandas cross join.
I put my M4 into thermal shutdown twice and hard-locked it due to swap use three times in the last month. I keep records so I can talk with IT about our dev machine specs. Apparently you can't run 30 concurrent yarn builds on a 3GB codebase... who knew.
This isn't a "works on my box" competition. I'm glad your workloads are that small; you can be a lot more efficient than me. I'm also lucky I bought all this RAM before it became absurdly expensive.
It doesn't negate that I'm constantly over 64GB, and that I'm super happy I have 128+ on my machines.
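The pandas cross-join blow-up mentioned a few comments up is easy to reproduce at toy scale. A minimal sketch (assuming pandas >= 1.2, which added `how="cross"`; the frames here are made up):

```python
import pandas as pd

# A cross join yields len(left) * len(right) rows, so memory use grows
# with the product of the inputs: two modest 1,000-row frames already
# produce a million-row result.
left = pd.DataFrame({"a": range(1_000)})
right = pd.DataFrame({"b": range(1_000)})
crossed = left.merge(right, how="cross")
print(len(crossed))  # 1000000
```

Scale the inputs up to real analysis datasets and the result easily fills tens of GB of RAM, which is the workload being described.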
Most people who are into graphics processing (e.g. video games, 3D for the film/entertainment industry) or doing fluid mechanics need these "PRO workstation" machines.
If your work is around data or software engineering (web backends etc.) like mine, a MacBook Air tends to be sufficient.
Yeah, good luck with that at current RAM prices, though. DDR5 RDIMMs are going for $20+/GB right now, which means 1TB is $20k, and that's with fairly conservative pricing too.
I've been looking at building a high-memory workstation recently, but the RAM prices are just prohibitive. The best option atm for 1TB+ seems to be to go back a couple of generations and buy DDR4; you can get 1TB at under $5/GB right now. But obviously you're giving up some performance in the process.
>I literally couldn't find anything on the PC side. I wanted an x86 because I prefer Linux Mint as my OS (didn't care about windows) , but it was impossible to find a good laptop with good GPU , more than 64gb ram and decent build materials
Maybe the ROG Flow 13? It's more of a hybrid laptop, and geared toward "gaming" (because it's usually the gamer market that demands high performance), but nothing prevents you from using it as a business machine.
It's also a top-of-the-line ASUS laptop, so I expect decent build quality.
> decent build materials (ive got a thinpad and the platic build is just terrible. The screen bends when pulling it to open the laptop).
ThinkPads don't show off their build materials like Apple does. I've had several over the years, variously made of magnesium alloy and carbon fibre.
Screen bending is not a great metric of "decent build". My ThinkPads have survived people stepping on them, being dropped, etc., and I think the lid flexibility is partly why they've lasted this long; they often use carbon fibre on the back of the screen.
The ASUS ROG G14 is as close as you're going to get on the x86 side of things, or that new chonk of a Surface with the Ryzen AI Max+ 395 and 128GB of RAM. Both are like $2500+.
I have the chonk. 10/10 would chonk again. I miss the 12" MacBook form factor for an email/web/dumb terminal machine, though. Would love something like that with great Linux support. Bonus points for cellular.
You do have 13" options, though the 14" selection is much wider. If I were going for a 13" workstation, I'd go for the Asus ProArt PX13 with the Ryzen AI Max 395 (if I got that right; there might be a plus somewhere) and 128GiB of RAM. They've also got the ROG Flow X13 with older hardware, or the Z13 with the same hardware as above, but that's a tablet computer instead.
At 14", thin-and-light gaming computers like Asus G14 or Razer Blade 14 look decent, or some of the workstation models from Lenovo or HP.
Still, for me, at 13/14", portability and battery are most important, so I am going with Thinkpad X1 Carbon atm (next gen should again allow 64GiB of RAM).
For someone looking to switch from an M-series MacBook to a ThinkPad, which one would you recommend? Preferably not of diminished quality, so I can daily-drive Ubuntu without missing Apple.
> Other than Microsoft nobody even makes decent laptops in the Windows world.
I get the impression that Microsoft and the PC world have given up on consumer hardware and are instead completely focused on enterprise and AI. That's why Windows 11 is saturated with bugs and basically unusable, but enterprise is forced to buy it.
It definitely feels that way. Microsoft has made it clear they don't care about the consumer market anymore. Xbox is dying or already dead, they've done nothing with the game studios they acquired, Windows laptop OEMs still ship plastic 1080p crap targeted at general office workers.
They'll continue to sell it, because it's effectively free surveillance for them, but they certainly aren't focusing on the consumer market as a target demographic.
And with fewer and fewer Windows-specific apps nowadays, there's very little reason for the average user to buy a Windows laptop, especially over this new MacBook.
Indeed they haven't, Microsoft is only one of the biggest publishers in the world, and regardless of XBox the console, Microsoft Games Studios is doing great.
What have they produced recently? I found a few lists online and looked at Wikipedia, and their big hits are all > 10-15 years old (or sequels/re-releases). Many of those are decade-old acquisitions of franchises that were ancient at acquisition.
Distribution and marketing? I wouldn't even know how to buy one of their games. I have a Linux gaming PC, a Mac, a switch, an iPhone, iPad, Apple TV and a XBone. We spend a few hundred dollars a year on video games, but I haven't seen anything suggesting any MS studio products work on any of our hardware, or are available on any distribution channels that reach any of our devices. Maybe they're on iPad, iPhone or Android? I haven't checked because we don't use those for gaming.
Windows is at ~ 95% of steam market share, so I guess that's one bright point for MS studios. However, many game developers release on Steam and console, so it doesn't imply that 95% of those other studios' customers could run a MS studio game.
Customer retention? The last time I plugged the XBone in, I spent 45 minutes screwing with bugs in the account password dialog, finally logged in, and then walked away. I unplugged it for good after the updates completed. In contrast, I spent less time than that installing Linux + Steam on my most recent PC. I guess they dropped support for XBox One at some point? I started having problems with it five years ago. I don't remember a big compatibility-break launch since I purchased it, so I'd expect it to be able to turn on + connect to their servers, or at least run the games it's already downloaded + installed.
I do own one MS game that still works: a copy of Minecraft. It took over 8 hours to figure out how to get it to stop constantly asking my kids for my master MS account password. That did convince me to actually wipe all data from my Microsoft account, so I guess it was a win.
There are some decent-looking AMD + nvidia laptops from Razer. No idea if they run Linux well, or are reliable, but they seem to tick all the spec boxes. For instance, they have a higher resolution than the monitor I owned in 2001 (3200 × 1800 @ 120Hz minimum on their 14"), and probably OK battery life if you don't use the discrete GPU.
From what I've seen of Win 11 in VMs, it doesn't seem compatible with the phrase "decent laptop".
>That's why windows 11 is saturated with bugs and is basically unusable
That's far, far from my experience. What bugs are you talking about that make it "unusable"? I've been on Win11 for years and it's been no problem at all. No bugs that I can think of.
You must be lucky. They have been well-documented. [1]
The constant, annoying reminder to sign up for One Drive is enough to drive me crazy and want to throw my device out the window (I am writing this from a windows 11 laptop that I use for experimentation).
Apple seemed to copy this one exactly as iCloud asks you the same all the time. Honestly these days Linux feels like the only sane platform as you can customize it properly.
I am a big fan of the command line, but running linux as my daily driver is like trying to daily a kit car -- it breaks all the time and i spend more time than i want fixing it. With macos, i get my beloved command line, nice hardware, and a reliable OS. Win win win.
> I am a big fan of the command line, but running linux as my daily driver is like trying to daily a kit car -- it breaks all the time and i spend more time than i want fixing it.
Linux powers the entire world. Billions if not tens of billions of devices. It doesn't "break all the time like a kit car". I switched my wife's desktop from Ubuntu to Debian about a year ago and I haven't heard a single complaint. Not a single crash. She hardly reboots her computer. The thing is just rock solid, and it needs to be: she works from home and spends 8+ hours a day on her (Linux) computer.
Fair. Last time I tried to daily Linux was 2016 with a crappy dell I had laying around, and I am pretty sure that I did not know what I was doing. I have been on Mac since 2012 and I tried windows in 2019 only to regret it, then went back to mac.
That is also far from my experience. I'm starting to think it's more about you than about the tech. I have 5 machines running Linux, and they never break (1 server and 4 VMs). I have 4 machines running Windows (3 physical, 1 VM), with zero problems for many years.
That really doesn't make sense to me. For Windows to keep its stranglehold on the enterprise, you need people using Windows in everyday life so that they don't need to be retrained when they enter the workforce.
It really isn’t. The track pad on surface is terrible compared to Mac. The surface has some weird edges and other spots to get caught on. I’ve seen a few with serious damage from typical daily use. The surface I have is barely hanging together, the charger is extremely finicky and will stop charging randomly. It takes effort to get the charger to “sit” in the slot and make contact.
That said, my surface is pretty old so maybe some of these design flaws have been fixed.
But from my experience, the build quality of the MacBook is in a different league than the surface.
telling that this is flagged 1 minute into submission.
Microsoft hardware was in the premium tier for sure (and continues to be: relative to others), but these days nearly all the OEMs have pretty bad warts across the line-up, even the surface books, even the new ARM ones (which are quite good).
For work I have a Thinkpad T14S (ARM also) and it is a better quality notebook than the Surface book others in my organisation have (those feel like a 95%-ish imitation of Macbooks, the only variations being strict downgrades in their respective areas).
So I'd push back on the idea that nobody is making good Windows computers, but it seems to be fewer and fewer, and the big brands like Dell Latitude and HP Elitebook have also been dropping the ball for a long time now.
Dell Pro Max, I think the Latitude line disappeared. But I feel Lenovo is the last one too, the only brand I trust for a Windows or Linux machine these days. I like Apple hardware and have my reservations with macOS, but it is still better than Windows.
IMO "build quality" is not the right term here. At least to me, "build quality" refers to how evenly examples are made and how close the real world examples adheres to manufacturing blueprints.
If the finishes and gaps are tight, all around the body and across units, the build quality is GOOD. If every unit looked slightly different and some were outright broken straight out of the box, the build quality would be BAD, even if the design were worthy of inclusion in the MoMA collection.
Both Microsoft and Apple (or their Chinese contract manufacturers) are top notch here. Every unit looks the same, and the flats on the bodies are really flat. Industrial design and usability, like sharp corners and fugly aesthetics, are different issues entirely.
You're right. "Build Quality" isn't the right term.
Maybe "Overall Quality" or "Device Quality" would work better. The point is that my MBP has held up MUCH better over time than my Surface, which is barely able to charge at this point.
Manufacturing tolerance is the term for "how close are they all to being the same shape?" Good tolerances are usually a prerequisite to good build quality, but not always.
For instance, cast iron pans can have poor tolerances (be off by fractions of inches), but, as long as they're not warped, and the metallurgy is solid, they could last centuries, and people would say they have good build quality.
On the other hand, a stainless steel pan that's volumetrically-perfect, but has faulty internal welds on the laminated bottom could fall apart after a few uses due to heat strain snapping the welds. That'd be terrible build quality.
The camera on the Surface is nowhere near as good as on my M1 Macbook Air, either. That seems to be a weird blind spot on laptops in general, it's very obviously an afterthought on my personal Dell XPS as well.
Taking the laptop to the office and back home again daily. The hinge has gotten weak over time. The connection to take off the screen is very fragile, tapping the button to enable removal only works about half the time. Then, when re-attaching the screen sometimes it doesn't catch, or the keyboard connects but doesn't realize it is connected so the machine stays in tablet mode. The trackpad has gotten spongey and harder to click.
It didn't happen to me, but of the 4 people in my direct team that had them, 2 had battery issues where the battery expanded, making the laptop unusable. *Edit: This was covered under warranty, thankfully
This is from approximately 2 years of daily use for work. I no longer use my surface.
I typically care for my laptops very diligently. I still use my MBP from 2012 and it works like a champ. I don't have a windows laptop anymore, but my main desktop is windows. I'm not a Mac fanboy.
Your Mac "fanboy" nonsense is tiring. The Surface 7 Laptop is an excellent machine, built well, and even gets a good iFixit rating (4 screws and you can replace the battery and M.2)
I'd rather have a thousand form-factors and build qualities to choose from than the one-size-fits-all that Apple offers. If Apple doesn't make it, then you can't run their software on it, and they don't make too many form factors.
I can run Windows on a USB stick form-factor if I want to. Or dozens of tablet sizes from various vendors. And every kind of laptop imaginable, with all kinds of features. And everything else up to massive rack-mount server hardware. But sure, if a Macbook is all you need, then go for it.
This is not primarily competing with the surface line of laptops, this is mostly competing with chromebooks which dominate schools. That's a completely different segment of devices.
I am in education and speak to others at the (US) national level on a near-daily basis. This doesn't compete with Chromebooks in schools at all.
- Chromebooks in EDU cost approximately $290 (+- $10) per unit.
- The Neo costs $499 per unit for schools.
- For the cost of 10 Neos, I can buy 17 Chromebooks. Yes, this is a numbers game. The goal is every student has a device.
- Schools use Chromebooks for Google logins. If you want reliable Google logins on macOS, you have an additional big up-front spend, along with per-seat licensing costs.
- This doesn't even factor in MDM and app cost comparisons.
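The budget math above checks out; a minimal sketch in Python, using only the per-unit prices quoted in this comment:

```python
# Per-unit prices quoted in the comment above (USD)
CHROMEBOOK_EDU_PRICE = 290  # approximate, +/- $10
NEO_SCHOOL_PRICE = 499

budget = 10 * NEO_SCHOOL_PRICE            # cost of 10 Neos
chromebooks = budget // CHROMEBOOK_EDU_PRICE  # how many Chromebooks fit in that budget

print(budget)       # 4990
print(chromebooks)  # 17: seventeen Chromebooks for the price of ten Neos
```

At scale (the goal being one device per student), that 1.7x multiplier is exactly the "numbers game" described.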
While I was in high school, as a punishment for a fake "hacking" prank [0], I had to spend half of my lunch breaks with the school IT guy for two weeks as he went around and fixed the damage students were doing to the school computers.
There wasn't actually THAT much going on and we mostly just sat in his office and chatted, but when he did have to deal with something, it was absurd the vandalism that happened. One kid had unplugged a mouse and managed to jam the plug into the floppy drive. The IT guy was like "It takes talent to be this much of a piece of shit" as he had to disassemble the case to get it out.
When it comes to issuing laptops to public school students, I'm torn. On one hand, people need computer skills, but on the other, I just don't think many students can be trusted with a piece of equipment costing hundreds of dollars. Hell, how many people can't even own a personal cell phone without somehow shattering the screen in just a few months?
[0] I had created a two-page slide deck with a black background and white text, then filled both slides with the same text that made it look like a DOS prompt and that Windows had been deleted. It had a C:\> prompt, and on one slide, there was an underscore after the prompt. I then made the slide show auto-play and loop, making the underscore blink, which made it look like an actual prompt. Keep in mind, these were Macs. There was no "C" drive, and certainly no Windows. A teacher insisted I broke the computer, despite showing that pressing any key ended the show, took me to the Principal's office, who gave me my punishment. My first time talking to the IT guy, he was like "you did what now?" and I showed him, and he thought it was funny as hell. Honestly, my "punishment" ended up being pretty fun. That was all 25 years ago. I wish I remembered his name so I could look him up.
I’m old, so all our computers were in the library or lab. But, kids will be kids. We would stick paper clips into the sockets to see if we could trigger the breaker, among other idiotic things.
Not just durability, but ergonomics. My kids have crappy screens, literally the worst trackpad I’ve ever used, and awful keys that hurt my hands minutes after typing (but I go all day on my personal computer).
If schools are found to be neglecting a minimum standard of care by subjecting kids to hardware that causes long-term physical issues, they'll wish they had spent a little more (it amortizes to about a $20/student-year difference the way our school district does it).
Somehow while spending the most per capita of any nation on the planet, American schools are in a perpetual budget crunch. It's about getting internet access not whether the trackpad is good. You think a chromepad is crappy - have you ever tried to do something in Blackboard?
> If schools are found to be neglecting a minimum standard of care
They won't be. Pizza sauce is considered a vegetable.
An aside: Why do school board superintendents and administration make more money than teachers themselves? I believe they shouldn't.
> Why do school board superintendents and administration make more money than teachers themselves?
The more and less cynical explanations (and both play a role, IMO):
(1) Because individuals in those roles have closer relationships to the people that set the salaries than do individual teachers, and
(2) Because otherwise people with experience in education would continue as teachers and not seek roles as superintendents or other administrators (or seek the advanced degrees sought for those roles whose only financial payoff is greater competitiveness for those higher paying roles.)
> Why do school board superintendents and administration make more money than teachers themselves?
A couple reasons:
1. Because usually, superintendents and the administration are responsible and accountable for a lot more moving parts than teachers are, aside from the many kids each teacher teaches. Which leads us to point #2.
2. There is a lot more supply of teachers than demand. If a teacher doesn't like their objectively meager pay, they can quit. There are 10 applicants lined up waiting to take their position.
> I believe they shouldn't.
This is generally handled at your city level. Organize your like-minded constituents to lobby the board?
Our district had BYOD and just got rid of it this year. We used it because the teachers couldn’t manage keeping kids off games or YouTube on their Chromebooks during class. Even then, personal devices could not be used for state testing.
Chromebooks don’t have a durability problem. I doubt the MacBook is any more durable, even with an all metal construction - if anything, that probably makes it worse at absorbing impact than nice soft bendy plastic.
This is just how students treat laptops, and a more expensive unit only makes the problem worse.
Actually, metal's pretty bendy when compared to plastic (most plastics, anyway; YMMV based on the formulation).
The metal construction is what prompted me to switch over to MacBook Pros back in the day. The plastic Dell laptops I used to use couldn't handle the abuse they took during all of the travel I was doing at the time (cases kept cracking). I switched to a Pro and was rewarded later when it survived a 5-foot fall from a car rental counter. It bent part of the corner, but the screen was still intact and it continued to work well enough to get me through the trip. I suspect the plastic alternative would have been toast.
Having kids today and seeing how rough they are with their toys, I'm not confident that a plastic laptop would survive them long.
Metal is technically more elastic than an elastic band.
With a Young’s modulus of 69 GPa for aluminum versus just 2 GPa for ABS, metal has the "memory" to snap back from significant pressure. Plastic, true to its name, is far more likely to hit its limit and stay permanently deformed. (That’s why metal bars are used to provide “flexibility” to buildings; the concrete provides the strength.)
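To put rough numbers on that, here's a minimal sketch of what Young's modulus means in practice, using only the two moduli quoted above; the 10 MPa load is an arbitrary illustrative stress I've assumed, not a figure from the comment. Elastic strain is stress divided by modulus:

```python
# Young's moduli quoted above, in pascals
E_ALUMINUM = 69e9  # aluminum, 69 GPa
E_ABS = 2e9        # ABS plastic, 2 GPa

stress = 10e6  # 10 MPa: an arbitrary illustrative load (assumption)

# Hooke's law in its simplest form: strain = stress / E
strain_al = stress / E_ALUMINUM
strain_abs = stress / E_ABS

# Under identical stress, ABS deflects ~34.5x more than aluminum,
# which is why a metal chassis feels so much stiffer in hand.
print(f"aluminum strain: {strain_al:.6f}")
print(f"ABS strain:      {strain_abs:.6f}")
print(f"ratio: {strain_abs / strain_al:.1f}")  # 34.5
```

Stiffness is only half the story, of course; whether a panel springs back or stays dented also depends on where each material's yield point sits, which the moduli alone don't capture.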
Kids are given those for free, so there's no responsibility for them to keep them in good condition. It would take a restructuring of laptops within the school system, toward kids/families having joint ownership of the laptop, to stop them intentionally destroying them. Even then, there are complications, like kids who will absolutely destroy another's for fun.
And knowing how laptop makers treat keyboard repairs, the keyboard switches are easy to damage beyond repair and expensive to replace, making them a target for "problem" kids in school districts with a dysfunctional penal system.
My kids have (insanely shitty) chromebooks from school and we are absolutely responsible for the cost if they break. We have to sign a release at the beginning of the year. Whether or not they’d be able to collect from the vast majority of families is a different question, granted. But the responsibility is there.
In practice, there's a huge difference in responsibility between buying and sending your kid with a laptop and signing a paper that says you're responsible if it breaks. I'd also guess it depends on where you go to school.
My child's school provided Chromebook was broken from the beginning, so clearly they're not paying that much attention.
> Kids are given those for free, so there's no responsibility for them to keep them in good condition.
Very often they aren't (the school devices are in-school resources that aren't given to the kids any more than their desks are) and anything the kids have out of school is bought by the parents (and even if they are given the computers by the school, usually the replacement cost is on the parents if there is damage). But, either way, grade school kids are, on average, irresponsible as a matter of cognitive development (it's a big part of why children are treated differently than adults legally).
> school districts with a dysfunctional penal system.
A school district that can be described as having a “penal system” is, ipso facto, dysfunctional.
Who pays for the laptop when the school bully pours water on a kid's backpack? Or a kid has their bag in a seat and someone sits on it accidentally?
What happens when a kid's laptop is broken, regardless of the reason, and the family is unable to afford to repair it? Are we going to run into a similar situation that we had when kids couldn't pay for school lunch? Do teachers write "pay for a new laptop" in sharpie on the kid's arm for the parent?
A child's educational environment is a lot more chaotic, violent, and uncontrolled compared to an office environment. If you're issuing my child a $600 laptop and making me responsible for any damages, guess what's going to be kept at home in a secure location?
Making a child responsible for securing a laptop in an insecure environment isn't accountability, it's just a form of imprisonment.
What happens when a backpack full of paper books is destroyed? When I was a kid, we were charged between $50-100 for a book that was written in or destroyed. I bet these days it would be $200 each. Yeah we were running around with $500-600 of books in our backpacks all the time.
Back in the day it was also our (kids/parents) responsibility to provide book covers. We always used paper grocery bags, but you could buy some that were purpose built.
I don't think institutions will care much about the enhanced durability since they treat laptops as disposable units anyway. Apple can only compete if they provide bulk deals which bring the overall cost in line with chromebooks.
No, we really care about durability. The amount of damage is crazy. So many units are damaged that it would be cost-prohibitive to dispose and replace them.
The screenshot in that Reddit post more or less looks like ours. Schools generally repair these, if they have the technicians. And everyone is cannibalizing parts out of last generation models. It's like a Jawa shop.
> Apple can only compete if they provide bulk deals which bring the overall cost in line with chromebooks.
I've never seen, nor heard of Apple providing competitive prices, even in quantities of ~10,000 units. They haven't even gotten close and they've largely given up on the idea of Macs as a standard K12 school device. ~$250 iPads are still strong in low primary grades and special education, though.
Can confirm Apple gave up on education. If they really cared you'd be able to have multiple accounts/profiles on iPad, and that's still not a thing that exists.
I did a major PTA fundraiser to buy iPads for our classrooms and they were pretty much never used because of this.
> you'd be able to have multiple accounts/profiles on iPad, and that's still not a thing that exists.
It does exist, it just requires the iPad to be managed via MDM, which most schools would have (and should implement if they don't have it). JamF, Mosyle, Business Essentials, InTune and probably any other MDM can put an iPad into shared iPad mode with multiple profiles.
I could see the Neo as a viable option for teenage students. The high school I attended distributed a Chromebook to each student and hardware faults were far more common than student inflicted damage. Low build quality in everything from the hinges to the logic boards. Most students feared seeking a replacement device when theirs would break without having done anything wrong. A device with higher build quality and software longevity has the potential to save these institutions a reasonable sum of money in the long run.
Younger students, on the other hand: Chromebooks remain the way to go. Most of the time, the kids' destructive tendencies will win the race against the crappy hardware giving out on its own.
> I could see the Neo as a viable option for teenage students.
100% agreed. My statements weren't meant to indicate the Neo wasn't viable. They were meant to state that the Neo isn't going to replace Chromebooks in schools (as far as being District-purchased).
> The high school I attended distributed a Chromebook to each student and hardware faults were far more common than student inflicted damage. Low build quality in everything from the hinges to the logic boards.
Build quality has been steadily improving over the years. It's all still budget (target ~$290), but is more and more durable with each new generation.
Even so I imagine your average person needing something for education would consider both. The Neo may cost more but from my past experience of Apple stuff they will likely be better made.
Certainly possible. But in the US, consider that Google and "one-to-one" Chromebooks are generally dominant, and the curriculum more or less requires Chrome extensions and settings.
As an example, my kids try to do school work on one of the house Macs, but there's too many roadblocks so they just use their Chromebooks.
I used to buy my kids Chromebooks for school, but, since the pandemic, the school issues them, so I haven't bought any since.
> Apple stuff they will likely be better made
It depends on what you mean. Apple uses higher quality parts and is more sleek.
Chromebooks are more durable, take more abuse, are very repairable, and parts are cheap and plentiful. These are key for schools. We're at a point where schools cycle out old models and either keep a bunch around or strip parts from them, because some parts are interchangeable between generations.
> This doesn't compete with Chromebooks in schools at all.
Of course, it does. The price difference is small enough now that the Neo is in the running. There's no doubt the build quality is going to be much better than a Chromebook.
I worked in education for 20+ years; that $499 is just the starting price; a school or school district that buys them in quantity is going to get an even better price.
Sure, a Chromebook is better than nothing, and if you’re an impoverished school district, you may have no choice but to go with Chromebooks. But if there's an opportunity to get Macs at this price point, most school districts are going to take it.
Don't underestimate Apple's sales and support infrastructure. Many of the schools in the US are in areas with Apple retail stores, which handle both sales and support.
It's hard to imagine a school committee going with Chromebooks instead of Mac Neos when the Neos cost only a little more and likely come with better support. The parents aren't IT experts, but they know Apple is a trusted brand, and Macs are "better".
>a school or school district that buys them in quantity is going to get an even better price.
Citation? I've read and heard others say this is the opposite of the truth: that Apple never gives bulk discounts. Heck, there's someone in this very discussion saying the same, with actual prices paid in their comment showing real first-hand experience, and yet you come in here with a hand-wavy, unsupported claim that Apple gives breaks for bulk buys.
> I've read and heard others say this is the opposite of the truth, that Apple never gives bulk discounts.
When I worked in higher education at an Ivy Plus university for 14 years, we were able to get discounts. We also had a campus store where we sold and repaired Macs.
I understand that higher ed is quite different from K-12. It also seems like sales reps have much less leeway now than they used to. I have no doubt there's been multiple reorganizations, etc. and things could be completely different now.
It's not just discounts; even if you pay the normal education price, Apple could throw in some extras (software licenses, AppleCare, etc.) to sweeten the deal.
So what segment does it target, in your opinion? The "surface" market is minuscule and, compared to the edu market, irrelevant; the "vendor lock-in" angle with the Google logins can easily change overnight, as it did with Microsoft.
- People who have a desktop computer, but want a cheap portable for on-the-go.
> The "surface" market is minuscule
Probably so, but then again, I see a lot of Surface devices out and about and they are fairly popular with non-teacher education staff. While they aren't competing with Chromebooks or Apple on volume, I'd bet they're doing well.
If the school is wealthy enough to provide free laptops, then you're right they're going to go for the cheapest option. But if the school expects the parents to provide laptops, then the parents are more likely to choose this.
The better question is why on earth do school kids need computers, especially anything beyond a robust desktop PC that they use extremely sparingly? No one ever seems to be able to give a good answer as to why we do this to ourselves, especially elementary school kids other than parents/teachers/schools are some combination of overwhelmed, lazy, or just outright massively derelict.
Not to disagree too much with your assessment, one point stands out:
> The Neo costs $499 per unit for schools.
We don't actually know this. It does at the level of an individual student purchasing one themselves, but I'd imagine there is a substantial bulk discount for educational establishments. That is not a new trend for Apple; it dates back to the Apple II.
We do because this is historically the norm. Schools pay roughly the same as the "college student" pricing, aside from the occasional deals they toss us.
I reckon even an iPhone pro is better value than an average android phone. Same with iPad vs Android tablet.
Because they last 3 possibly 4 times longer. A decent Apple laptop purchased 4 years ago is still basically a top notch laptop. Build quality is amazing. Resale value is still very high.
> If you buy Android flagships after 2022, they also last 4-6 years.
The hardware lasts but they usually stop getting software updates after a few years, especially if they're not high-end models.
Last month, Apple released an update for the iPhone 8 and iPhone X [1]. The iPhone 8 was released September 2017. I seriously doubt 9-year old Android phones, even flagship models, are still getting software updates.
> Last month, Apple released an update for the iPhone 8 and iPhone X [1]. The iPhone 8 was released September 2017. I seriously doubt 9-year old Android phones, even flagship models, are still getting software updates.
How usable is an 8-year-old iPhone as a primary phone though? I agree that having 8 years of support is a good thing, but at that point the hardware is so degraded that it's not suitable for its original purpose anymore. At that point I'd rather have android just so I can root it and install Linux. Then again, with improvements to phones slowing down in recent years, this is becoming increasingly untrue.
Samsung's flagship Galaxy series get software support for 7 years. A, M, F mid-range and low end models get 6 years of software support. The worst case today for the most popular mid to low spec phones is twice the "a few years" you claim, which suggests you're out of touch with the changes in the industry over the last few years.
> The worst case today for the most popular mid to low spec phones is twice the "a few years" you claim, which suggests you're out of touch with the changes in the industry over the last few years.
Are you sure about that? Apparently nicer Android phones not getting updates for very long is real.
No Longer Receiving Updates
Google Pixel [2], [3]
| Phone | Released |Updates Ended|
|---------------|----------|-------------|
| Pixel 3 | Oct 2018 | Oct 2021 |
| Pixel 3 XL | Oct 2018 | Oct 2021 |
| Pixel 3a | May 2019 | May 2022 |
| Pixel 3a XL | May 2019 | May 2022 |
| Pixel 4 | Oct 2019 | Oct 2022 |
| Pixel 4 XL | Oct 2019 | Oct 2022 |
| Pixel 4a | Aug 2020 | Aug 2023 |
| Pixel 4a (5G) | Nov 2020 | Nov 2023 |
| Pixel 5 | Oct 2020 | Oct 2023 |
| Pixel 5a | Aug 2021 | Aug 2024 |
As of late 2024, the Pixel 3, 3a, 4, 4a, 5, and 5a series are all fully out of support. The Pixel 5 received Android 14 as its last OS update with a final security patch in October 2023, and the Pixel 5a concluded support in August 2024, also on Android 14.
Samsung Galaxy [1], [4]
| Phone |Released | Updates Ended|
|----------------------|----------|--------------|
| Galaxy S9 | Mar 2018 | ~2022 |
| Galaxy S9+ | Mar 2018 | ~2022 |
| Galaxy Note 9 | Aug 2018 | ~2022 |
| Galaxy S10 | Mar 2019 | ~2023 |
| Galaxy S10+ | Mar 2019 | ~2023 |
| Galaxy S10e | Mar 2019 | ~2023 |
| Galaxy Note 10 | Aug 2019 | ~2023 |
| Galaxy Note 10+ | Aug 2019 | ~2023 |
| Galaxy S20 | Feb 2020 | Early 2025 |
| Galaxy S20+ | Feb 2020 | Early 2025 |
| Galaxy S20 Ultra | Feb 2020 | Early 2025 |
| Galaxy Note 20 | Aug 2020 | ~2024–2025 |
| Galaxy Note 20 Ultra | Aug 2020 | ~2024–2025 |
| Galaxy S20 FE | Oct 2020 | Mid 2025 |
| Galaxy Z Fold 2 | Sep 2020 | ~2024 |
| Galaxy Z Flip | Feb 2020 | ~2023 |
The Galaxy S20, S20+, and S20 Ultra received their final update in the form of the January 2025 security patch. After originally launching in 2020, Samsung had promised four years of software support for the S20 trio — three major OS upgrades (Android 10 to 13) and four years of security updates.
On Their Last Legs (Security Updates Only, No More OS Upgrades)
These are still receiving quarterly security patches but will drop off soon:
- Galaxy S21 / S21+ / S21 Ultra — Final OS was Android 15; now on quarterly security patches only
- Galaxy S21 FE — Will receive Android 16 as its final major upgrade via One UI 8, after which it moves to quarterly patches with no further OS updates
- Pixel 6 / Pixel 6 Pro — Now updated to a 5-year support window, with final updates expected in October 2026
- Pixel 6a — Supported until at least July 2027
---
The main takeaway: if you're on a Samsung S20-era or Pixel 5a-or-older device, you're fully unprotected. The Galaxy S21 series and Pixel 6/7 families still have some runway left, though they're winding down.
Physical durability will play a major factor here. If schools are expected to provide the Chromebooks then it will all boil down to the level of abuse/neglect the hardware can handle.
Replacing a $250 Chromebook with low resale value, one that is just as sensitive to being dropped, exposed to liquids, or getting debris into hinges and keyboards, will be heavily favored over replacing a $500 MB Neo. The Neo's processor and storage may have a better lifetime, but that doesn't mean anything if the equipment ends up bricked.
Schools in affluent areas may favor these for reasons you state. Judging on how students treat textbooks though should demonstrate how short the lifespan would turn out to be.
Framework might be appealing as well. Being able to have parts on hand that can easily be swapped out sounds a lot better/easier than dealing with Apple's repair practices. The Framework Laptop 12[0] starts at $549 and has touchscreen/pen options. But that price goes up to $799 to have it pre-built with an OS on it, which schools would want, unless building your laptop and installing the OS is part of the curriculum. I wonder if having the kids do this would make them take better care of it, because they had a hand in making it?
somewhat off topic, but I really am not sure that adding chromebooks to every school has made education better. hard to block youtube when they bring these home (I know you can, but the average person can't).
The only problem with Chromebooks and the whole Google educational toolchain is it ruins school!
My kid is on it, every kid hates it and every teacher hates it. You just can't argue with the pricing. I'm amazed at how bad everything seems compared to old-fashioned paper textbooks.
Every time I help my son I'm amazed how bad it all is. Horrible tiny screen that looks like it's from 2000, and then the software is all designed for some Googler who has 2x 30" 5k displays. The usability is atrocious.
Chromebooks are the SaaS of hardware where the user is not the buyer. No one says “I would love to have a Chromebook at home” any more than they desire to run Salesforce at home.
A Chromebook at the same price point will get you similar if not better specs, 14" 16:10 FHD IPS display, convertible with touchscreen and pen input, backlit keyboard and 10h+ battery runtime.
This is not competing against Chromebooks, which have very little reach outside of institutions. The Macbook Neo will likely have very widespread appeal for anyone looking for what used to be a netbook.
It could compete well in both. Looks like Apple has a product that competes with Chromebooks on price and competes with Surface on performance at the same time. At least close enough on both counts to create headaches for anyone trying to sell either.
Pen input is the one factor that forced one of my kids to a Windows laptop for school (a Surface Pro). It was a required feature for his school. Seeing how much he uses it for note taking, I get it. So yes, drawing is a key feature for schools.
Another school uses iPads with keyboards for the same purpose, so I'm not sure where the school market is for these. Maybe only older kids, but a lot of edu-tech is expecting some kind of touch/pen input.
It might be a fine laptop when you are on the move. I have an educational Lenovo for that purpose, but I would appreciate a Mac for that same use. When I need more power away from my desk I can use the MacBook Pro or my Lenovo T series (both a lot heavier than I’d like).
I just wish the Mac had 16GB of RAM but my tiny Lenovo has 8 and it’s been working OK so far - I haven’t even set up a proper swap partition and it’s running on zram.
At this point I think few people really will care about that spec difference.
The accumulated brand trust of Apple, and the negative brand trust of Microsoft outweighs the numbers.
Even many technically savvy people believe Apple can deliver a higher quality computing experience with 8GB of RAM than Microsoft can with 16GB, and they're often correct.
This is an important thing to Apple, and Apple users know it. They would not have put out this macbook if it was going to be a subpar experience. Microsoft has no such qualms about OEMs shipping an underspecced disaster of a beater laptop (see Vista).
You can (generally) buy any Apple product and know you are going to get something of quality and a good experience, even from the base/budget models. They don't really have any "bad" products.
And despite antenna gate, the iPhone 4 was still the best smartphone of that year, leaps ahead of its closest competition (the Galaxy S), and it remained the #1 best-selling smartphone a year after launch.
You can only buy hardware that runs Apple software from Apple, but Android mobile devices far outsell Apple devices and always have. Apple is and always has been a minority player in the overall smartphone market (and desktop/laptop as well).
Globally, Android has had about 70% to 75% market share, and Apple has always had a much smaller slice of the total. iPhones are not as popular as you seem to think they are. You don't have to believe me, the data proves it:
Sure, but that doesn't change the fact that the iPhone 4 was the single most purchased smartphone model in the US between 2010 and 2011 (during antenna gate that we are talking about).
Android has the majority share because "Android" is anything from a $100 piece of junk to a $1200 phone. If you look at only the premium market, Apple holds ~70% market share.
Despite antenna gate, it still sold plenty, which proves the point about brand trust that the thread was about.
If the brand equity wasn't there, the Galaxy S would have outsold the iPhone 4, but it didn't; it sold half as much.
>Sure, but that doesn't change the fact that the iPhone 4 was the single most purchased smartphone model in the US between 2010 and 2011
Are you trying to give Apple some kind of tech participation trophy? Because that's all you're doing.
>If you look at only the premium market, Apple holds ~70% market share.
Sure, Apple is a luxury brand, and so not many people can afford it. Nor should they be spending the ridiculous amount of money Apple normally charges.
>Despite antenna gate, it still sold plenty, which proves the point about brand trust that the thread was about.
Reality distortion field still in effect in 2026.
>If the brand equity wasn't there, the Galaxy S would have out sold the iPhone 4, but it didn't, it sold half as much.
I don't care about brands as much as you seem to, that much I'm sure about. Your precious Apple could never do you wrong, we get it.
> but Android mobile devices far outsell Apple devices and always have
"far outsell" is doing a lot of work in that sentence.
The iPhone has a market share of 60% in the US [1]. The leading Android manufacturer Samsung has a market share of 22% in the US.
These numbers are from last year; the iPhone sold like hotcakes in the European 5, the US (of course), Australia, Mainland China and Japan [2].
BTW, the European 5 consists of Germany, France, Italy, Spain and the UK.
Apple by itself globally makes up about 43% of the revenue in the smartphone market [3].
Yes, devices running the Android operating system sell a lot of units; the majority of them are no-frills devices from manufacturers most people have never heard of. Which is fine—having a phone is better than not having one.
But don’t act like Android is some kind of juggernaut; these five markets represent 2.24 billion people and 60% of the world's GDP. Android isn’t the bestselling phone in any of these countries.
# Top Selling Models
European 5
| Rank | Model |
|------|--------------------|
| 1 | iPhone 16 Pro |
| 2 | Samsung Galaxy A55 |
| 3 | iPhone 15 |
| 4 | iPhone 16 |
| 5 | iPhone 16 Pro Max |
US
| Rank | Model |
|------|-------------------|
| 1 | iPhone 16 Pro Max |
| 2 | iPhone 16 |
| 3 | iPhone 16 Pro |
| 4 | iPhone 15 |
| 5 | iPhone 14 |
Australia
| Rank | Model |
|------|-------------------|
| 1 | iPhone 16 Pro Max |
| 2 | iPhone 16 |
| 3 | iPhone 16 Pro |
| 4 | iPhone 12 |
| 5 | Samsung Galaxy A35|
Mainland China
| Rank | Model |
|------|--------------------|
| 1 | iPhone 16 Pro Max |
| 2 | iPhone 16 Pro |
| 3 | iPhone 16 |
| 4 | Huawei Mate 60 Pro |
| 5 | Huawei Mate 60 |
Japan
| Rank | Model |
|------|--------------------|
| 1 | iPhone 16 |
| 2 | iPhone 16 Pro |
| 3 | iPhone 15 |
| 4 | iPhone 14 |
| 5 | Google Pixel 8a |
Cute that the Apple fanboys constantly want to make this about a brand, and not a platform, because the Apple platform is very low ranking in the larger world of Smartphones. So you will literally redefine the conversation just to give your favorite company a participation trophy award.
> want to make this about a brand, and not a platform, because the Apple platform is very low ranking in the larger world of Smartphones.
Let me get this straight: you believe the iPhone "is very low ranking in the larger world of Smartphones" even though it's the most popular and best selling smartphone in the five largest economies on the planet.
I posted the 5 top selling smartphones in the European 5, United States, Australia, Japan, and China—out of 25 models listed, 80% (20 out of 25) were iPhones.
Don't hate the player, hate the game. No matter what you believe, the numbers are the numbers:
- Apple’s iPhone marketshare in the US is 60% vs Samsung’s at 22%
- the iPhone alone brought in $209.586 billion in FY 2025 [1]
- if the iPhone were its own company, it would be #9 on the Fortune 500
- Apple's iPhone revenue is greater than the revenue of Dell, Hewlett-Packard, Intel and AMD combined.
>"Apple’s iPhone marketshare in the US is 60% vs Samsung’s at 22%"
Which iPhone, which Samsung?
And you're cherry-picking the US market only.
Worldwide, Apple's market share sucks. Oh, but I guess the rest of the world doesn't matter to you as long as the numbers make sense in your own head that Apple is somehow "winning".
Apple has never had and never will have the market share that others have - Windows and Android eclipse Apple's 15%-30%. Those are the numbers you're so desperate to avoid acknowledging.
It's a pretty pathetic display of fanboyism, and it's rather boring - this "conversation" is over.
They say reading is fundamental; you might want to practice to get your comprehension up.
I literally provided the top selling smartphones in China, Japan, Australia and a group of 5 countries in the European Union. The iPhone topped the sales charts in all of them.
> Which iPhone, which Samsung?
All of them? The total of all the iPhone models sold in the US was about 3x the total of all the Samsung models sold here. That’s the 60% vs 22% difference I mentioned earlier.
> Those are the numbers you're so desperate to avoid acknowledging.
Nobody disputes Android’s 72% global market share vs Apple’s 27%. You can calm down now. ;-)
To simplify things for you, Android dominates in developing countries in Africa, Asia, Central and South America. For example, Android has 95%(!) of the market in India, which is ironic since iPhones for the US are made there now.
It goes without saying iPhone does much better in more affluent countries. So does Samsung.
> It's a pretty pathetic display of fanboyism, and it's rather boring - this "conversation" is over.
When someone isn’t doing so well in a debate, they resort to insults and name calling. Sad.
It’s not that your “opinions” are worth responding to on their merits—they’re not.
I’m writing for readers that might come across this thread and learn something they didn’t already know.
Sorry you wasted your time writing something that I won't read, but I told you, this conversation is over. You didn't "win" here, you only made yourself look like a pathetic, desperate fanboi.
Apple certainly puts out experiences that leave much to be desired, but to be pedantic, the word 'subpar' implies being below 'par'. If 'par' is set by Microsoft, then Apple easily clears it.
Nowadays Chromebooks offer more design competition for Apple, and even historically Linux distros have had more ideas for Apple to learn from than Microsoft.
>If 'par' is set by Microsoft then Apple easily clears it
That's clearly subjective. What you will accept from Apple is unacceptable to others as garbage, the same as you dismiss anything from Microsoft.
>Linux distros have had more ideas for Apple to learn from than Microsoft.
And yet Apple just copied Windows Vista with their "glass" monstrosity, which has been widely hated and lambasted. Again, you may love that, but that would put you in the minority.
Obviously it's a subjective discussion but it's still a meaningful subjective discussion.
I was deeply into Microsoft products for a while. I got my start coding an indie game for the Xbox, I spent years using Windows Phone and developing an app for the platform, I interned at Microsoft twice and then later worked there as a software engineer for a period.
While there I did my best to improve the product I worked on, and I went beyond what most engineers do when thinking about product quality. I would gently and politely email other product teams with bugs or minor product issues that I felt were low hanging fruit. On my own team I was often one of the stronger advocates for the user and for product quality, and sometimes I got pushback for it.
My opinion about Microsoft's product culture is not formed lightly.
I don't believe Apple is faultless, but I think they demonstrate far more awareness of how their product decisions accrue to a lasting brand. It's not just marketing spin, it's real actionable decisions over decades that accrue to brand perception.
>While there I did my best to improve the product I worked on, and I went beyond what most engineers do when thinking about product quality. I would gently and politely email other product teams with bugs or minor product issues that I felt were low hanging fruit. On my own team I was often one of the stronger advocates for the user and for product quality, and sometimes I got pushback for it.
You've described every company I've ever worked for. I guarantee that Apple does not work any differently.
>I don't believe Apple is faultless, but I think they demonstrate far more awareness of how their product decisions accrue to a lasting brand.
You're wrong about this, as evidenced by their "glass" debacle. I mean you didn't respond to my comment about that at all, and it's so glaringly obvious how bad and pointless "glass" was. Nobody wanted it, nobody needed it, and it made things objectively worse. That wasn't a display of product design acumen; it clearly exposed Apple's flaws in very public fashion.
Apple's "glass" UI update debacle should be evidence enough to quash any argument you could make. Their current performance leaves a lot to be desired, everyone hates "glass".
I like it. Debacle isn't the word you're looking for. "Some loud people on the internet don't like it and the user base has largely been ambivalent towards it. In reality, it's rough around the edges and needs some work."
The VP by most accounts is best in class. It’s just too damn expensive. There’s also still an open question if people really want to strap goggles to their face.
I bought mine for air travel, when I can strap it to my head for 12 hours and be in a completely different place. I can lie back in my lay-flat seat, so there’s no weight pulling my head down, and it’s an absolutely fantastic experience.
I fly sufficiently that this is well worth it. The fact that it doubles as a mobile computer in the hotel room is just icing on the cake.
So subway? Maybe not, but don't pretend they don't have their own niche…
Apple's UX quality, design focus, and respect for its customers is higher quality and more consistent than Microsoft's.
Apple is also imperfect and I feel leaves tremendous room to do better, but they are still much better than Microsoft.
Take one topic: UI refactorings. Apple has rolled out disruptive UI refactorings but they've also rolled them out consistently across products and throughout their software.
Microsoft did not have the internal leadership discipline or commitment to design to ever get their products in alignment around a design language. It is common on Windows that the included software all uses different design toolkits and design paradigms. For years Windows was infamous for having multiple ways to configure even common settings, often requiring falling back to the old version, because they were not able to ship a unified UX.
Microsoft routinely has 'UX design scandals' of various sorts with dark patterns forcing Microsoft's preference on users. Apple has those as well, but far less often.
MacOS is crazy efficient and can overcommit quite a lot.
I used an M1 Pro for a couple years to work. 8GB of ram but routinely using 12GB including swap.
Now, I couldn’t keep Slack and Outlook open, so there were limitations, but I was able to work. People are underestimating the usefulness of 8GB of RAM.
I guess it is also worth saying that I do my work by connecting to a remote server where I do the actual development and everything else. The Mac itself being a web browser and ssh machine
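If you're curious how hard macOS leans on compression and swap to stretch 8GB that far, its stock command-line tools will show it. These are real macOS-only diagnostics; output obviously varies per machine:

```shell
# Built-in macOS memory diagnostics -- nothing to install:
vm_stat               # page counts, incl. "Pages stored in compressor" and swapins/swapouts
sysctl vm.swapusage   # current swapfile usage, e.g. "total = 2048.00M  used = ..."
memory_pressure       # the kernel's own system-wide memory pressure summary
```

A machine "routinely using 12GB including swap" on 8GB of RAM will show a large compressor page count and nonzero swap here while still feeling responsive.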
Not being able to keep Slack and Outlook open at the same time seems like a pretty significant productivity hindrance to me. 8GB RAM is truly pathetic in 2022.
Children don't have Slack and Outlook open. Gmail in a web browser and Discord, maybe. My old M1 Air works just fine for productivity workloads, and has for years.
Not sure about slack vs discord, but browser Gmail is almost certainly less memory hungry than Outlook. And that’s probably enough of a difference by itself.
VS Code (or rather VSCodium in my case) is also electron based but it's been relatively snappy in my experience - though I don't use a lot of third party plugins.
> Not being able to keep Slack and Outlook open at the same time seems like a pretty significant productivity hindrance to me. 8GB RAM is truly pathetic in 2022.
I read this as how bad software quality has gone down, that a mail program and a chat program don't fit in 8GB of RAM.
Nobody except people on HN cares about RAM. People care about what you can actually do with the machine. The spec numbers are nothing more than numbers when a computer never works as it is supposed to. It's like having a 500HP car that can't actually drive.
Indeed, 8gb is plenty, even for serious work and coding, if you use the machine well.
If you think getting more and more RAM solves every performance problem, I've got news for you: People are having beachballs on machines with 32GB and more.
I agree generally that on Mac you can 'get by' with 8gb and for the target audience on this, and how they'll likely use it - it's totally acceptable.
But if it's for serious work, this is not the device. 'Managing' the software to 'use the machine well' to get serious work done is unacceptable in 2026. It needs to just work and disappear into the background. I have enough to think about and micro managing the software running is out of the question.
> 'Managing' the software to 'use the machine well' to get serious work done is unacceptable in 2026
I agree, I just don't think the rush to get more and more RAM and storage is the root of the problem.
Why on earth does a browser need more than 10 GB to display web pages?? Why does macOS keep piling/hiding trash that should be deleted in "System Data"?
And, if you need to keep device backups, put them on an external drive; that's what those things are for.
It depends on how you define "serious work". Is it to get the best results possible, or is it to tax a computer as much as possible? Programmers would usually answer the latter, while users would answer the former.
That's why programmers put their stuff into Kubernetes which go into virtual machines, which go into eleven layers of javascript abstraction which go into twelve thousand node packages, which go into something else to end up with something with very basic functionality, which usually doesn't work very well.
Other pro computer users are focused on the results, so they use professional office software, calendars, communications, photo and video editing and effects, photo-realistic 3D editors, studio level audio and music editing software. All which lives perfectly fine on 8GB of RAM.
As always - it depends on the kind of ostensible "serious work" you do.
I've got 32GB and often work with legacy .NET WinForms/WPF applications on a Macbook. That means spinning up a Windows 11 ARM virtual machine and running Microsoft Visual Studio. The VM has 8GB of ram allocated to it, and based on qemu-system memory pressure, it hovers around ~4-6GB of that.
I also do a lot of color grading and video editing with longform 4K videos using Davinci Resolve - scrubbing in an uncompressed format would absolutely thrash the hell out of your swap with only 8GB.
As much as I'd like to be more efficient, modern toolchains absolutely need these kinds of numbers for big projects. My 48GB system will OOM trying to link clang unless I'm extremely careful. The 64GB system is a bit more forgiving, but I still have to go for lunch while it's working.
Sure, might be ambitious to do that sort of workload on a budget conscious laptop, but it'd be nice y'know?
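For the linking-clang case specifically, LLVM's CMake build has a knob that caps only the memory-hungry link steps. A sketch, assuming an out-of-tree Ninja build of llvm-project; the cache variables are real, but the paths and job count are illustrative for your setup:

```shell
# Compiles still run fully parallel; only concurrent *links* are capped,
# which is where the multi-GB peaks come from. LLVM_PARALLEL_LINK_JOBS
# is LLVM's own CMake cache variable (it requires the Ninja generator).
cmake -G Ninja ../llvm \
  -DCMAKE_BUILD_TYPE=Release \
  -DLLVM_PARALLEL_LINK_JOBS=2 \
  -DLLVM_USE_LINKER=lld   # lld itself also needs less memory than GNU ld
ninja clang
```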
Rust exists. If you insist on using (or need to use) languages with horrendous build architectures like C++, then you probably need a proper build server then anyways.
I don't have XCode on my Macbook and have resolved not to do iOS development any time soon (although ideally I'd have wanted to dabble in it sometimes), because I've accepted I don't want to run the rat race of always needing beefier and beefier machines to keep up with Apple's bad habit of bloating it up for each version up for no good reason.
I don't run local LLMs on my machine, since even with 100s of GB of RAM, I hear the performance you can expect is abysmal.
I think it is a good idea to put pressure on hardware and software vendors to make their products more efficient.
I literally just ran into this myself with my spouse. She is ready to upgrade her M1 MacBook Air and thinks she doesn’t need more RAM because everything is “in the cloud”. Hopefully 8GB is enough RAM for the next 5 years or so...
My spouse bought a mac and asks me (mostly a linux user, and I'm happy to help) for support somewhat regularly (most recently, for a Tahoe upgrade). It's not the golden unicorn people paint it to be. 8gb is insane in 2026.
It may not be a golden unicorn, but I find it is quite a lot better than providing support for the Windows laptops they used to buy from random department stores on rock-bottom sales... Nothing quite like a $200 PC laptop stocked with OEM bloatware
I like my MB Pro but it has serious audio and external display issues.
I've had to remove spotlight indexing to prevent obscure OOM issues.
One time I woke up, opened my laptop, and found its screen cracked for no apparent reason. Since I couldn't prove it wasn't my fault I was charged for the repair anyway, and I'm grateful I had AC+, because otherwise I might as well have just bought another laptop.
At the end of the day, it's still just a computer.
"good laptops" yes. But I haven't seen a "great" one in a very long time. The Windows market is asleep at the wheel and a copilot button is not going to resuscitate it.
I think the Surface is as close to great as you can get. I'm not saying that I know the whole market of laptops, you probably know better. But the Surface is pretty good, which is weird because it seems like Microsoft isn't really focusing on it or even backing away from it.
I agree with the parent, that Macbooks are way ahead in terms of usability, polish and charm for a laptop. And the performance is outright stellar.
> Other than Microsoft nobody even makes decent laptops in the Windows world.
I completely agree. I actually quite like and get along with my Surface Laptop. It's a really nice computer overall. It's the closest you get to the same polish and usability that Apple has in their macbooks.
I absolutely love my M4 macbook pro, it's definitely the best laptop I've ever owned. I had an older macbook pro that I kept way past its lifetime too.
I think the problem is that Microsoft's hardware quality is super inconsistent. We had a ton of employees using Surface laptops and tablets at my previous company, particularly sales and support. The company stopped buying them after a few years because the first year failure rate was almost 15%. However, the folks that had the good ones often kept them for 5 years or more.
Don't know. Plugged it in one morning and it wasn't turning on. So I tried detaching it from its base, because that had been a problem before, but it was dead, so you needed to find the little manual release thing inside one of the vents, which I didn't have a tool for, so I gave up on that. Then I turned to ask a coworker to try their charging cord (mine had to be replaced once after it failed, and I assumed the same thing had happened again), and by the time I got back to my desk a small wisp of smoke was rising from the keyboard, which is strange, because to my understanding that's not where the computer bits are in that laptop.
So I unplugged it, at which point I noticed the smoke was increasing. So we doused it in CO2 (maybe N2, idk, some cheap gas we had lying around for the wetlab), pried the computer part off of the base, and then IT handled sending it back to M$.
Not if you move windows between screens with different scaling, or launch apps that don't support the scaling stuff out of the box, or launch apps via X11 forwarding in WSL.
All of this works much worse on macOS: Scaling sucks, as it's integer-upscaled rendering + fractional downscaling in a shader. Windows can't span screens either.
On Windows, the window will adapt as you move its center of gravity across the edge between screens. Sure, it could be better: for a moment the window is the wrong size. But the alternative would be a window that's always blurry.
I don’t think it is just a hardware issue: Windows still just maps all movements and scrolling directly into pixels and lines. Most programs just slightly blur the viewport when scrolling to hide the latency, but that just adds even more latency. You can disable the scroll delay in the web browser settings, but not any of the new applications, like the new notepad
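The macOS pipeline described above (render at an integer 2x backing scale, then fractionally downsample to the panel) can be sketched numerically. This is a model of the mechanism as the thread describes it, not Apple's actual code; the panel dimensions are just an example:

```python
def macos_fractional_scale(panel_px, looks_like_px):
    """Model: apps render at 2x the virtual 'looks like' size,
    then the whole buffer is downsampled to the panel's pixels."""
    backing = (looks_like_px[0] * 2, looks_like_px[1] * 2)
    downscale = panel_px[0] / backing[0]  # non-integer => every frame is resampled
    return backing, downscale

# A 1920x1200 panel set to "looks like 1280x800" (i.e. 150% scaling):
backing, ratio = macos_fractional_scale((1920, 1200), (1280, 800))
assert backing == (2560, 1600)  # GPU renders a 2560x1600 buffer...
assert ratio == 0.75            # ...then squeezes it 4:3 onto the panel
```

That final non-integer 0.75 resample is where the "subtle display artifacts" mentioned elsewhere in the thread come from; at exactly 200% the ratio is 1.0 and the step disappears.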
I have the Lenovo X1 and I'm very happy with it, though obviously that's in a pretty different price category than the Yoga, Surface, or Macbook Neo.
On the other hand, more money doesn't always mean better computer. I had a Dell XPS 9570 at a previous gig that had a lot of issues: coil whine, bad camera placement, terrible thermals, etc.
I think you're undervaluing touchscreen capability, which even the cheapest laptops offer now. Kids and non-tech folks have come to expect it by default.
Now that Apple is attempting to compete in this space, they'll have to pitch these folks on what macOS without touch capability offers over Windows with touch capability.
Maybe it will still sell well enough, maybe people aren't that stuck on touchscreens.
> it uses 150% scaling (as opposed to the ideal 200%) which means you have subtle display artifacts
I agree with you, but I’m afraid Apple doesn’t agree with us. The recent MacBooks do not use 200% scaling out of the box anymore. It is a setting that only nerds use. I have no reason to believe that out of the box the default settings on this MacBook Neo will use 200% scaling either.
I think macOS applications feel like they have mostly updated to use the native resolution, so arbitrary scaling works great now. My comparative experience with a new Windows laptop is how I remember macOS felt when they first made high density screens many years ago: lots of render bugs all over, and every program has to be re-opened when I plug in an external screen to be usable at the new resolution
We should probably have nicer scaling algorithms that account for moiré patterns. Also, when you see a moiré, it's because you are scaling a bitmap that has periodic dithering. These should be rarer now, and it's a good opportunity to replace them with vector images whose periodic patterns are tuned for physical dimensions rather than pixel count.
It's interesting, for years I have been trying to make my iPad a nice, slim laptop I could bring with me everywhere for lighter/coding specific tasks. I've gone through several keyboards trying to make this work. It never has.
Now with this laptop, I can do exactly this, while being cheaper than what I've been attempting to do with an iPad.
The ARM64 Surface Laptop is great and definitely matches the MacBook Air's quality, but yeah, there's no way it is competitive with the new Neo offering from Apple at current prices.
I hope this leads to a general decrease in price for laptops, but with the RAM crunch I don't see that happening…
ahhh....good catch. i used the wrong name: i meant surface pro 13, which has the detachable keyboard. yeah, it's odd that the surface laptops are lower res.
What about color quality? I've used high resolution laptops with shitty washed out colors, but one thing I've always appreciated about Apple's displays is their vibrance.
too late to edit: i was thinking surface pro (the detachable keyboard) not surface laptop with attached keyboard and weirdly low res screens, compared to the surface pro series.
Or a MacBook, which is part OP's point. Apple is delivering quality at price points that Windows OEMs aren't (which is sort of the opposite of the phone world).
The experience I have had with Thinkpads, both current-gen and old during my childhood, did not warm me up to the line. They are not particularly better in feel, thermals and screen quality against its cheaper alternatives including those from Lenovo themselves. The only good thing was its keyboard, but then most Lenovo laptops in general have good keyboards. Its popular acclaim is weird to me.
Their challenge is, how do we halve the price and yet deliver twice the quality? I think they are going to realize they can't. Some of them will leave the market.
Dell has 50% more market share than Apple and HP 2x Apple's market share in the PC space. I doubt they'll be exiting the market because Apple launched a cheap laptop with 8gb of ram and using USB 2.0 ports. Most corporations are still tied to Windows apps and the MS ecosystem in general.
Both of which look identical with no obvious markings which is which. I'm sure this will generate no confusion amongst consumers who will have no issues whatsoever with this. /s
"you're plugging it wrong" will become the new version of the classic "You're holding it wrong"
Yes, the Mac Neo will tell you you’re plugging in to the wrong port!
And while the ports aren’t labeled, if you plug an external display into the “wrong” port, you’ll get an on-screen notification suggesting you plug it into the other port. [1]
What are the odds the notification functionality was only tested to work with Apple's overpriced first-party accessories like $79 USB cables, and will have countless issues and edge cases with third-party accessories?
I'll just chime in to say that not everyone cares about the features you mentioned that much. Keyboard, touchpad, looks are the last things I think about when comparing laptops. Not to lessen your preferences, just to point out that there's a variety of viewpoints.
To make a different point, a regular consumer does not care about tech specs. They want a laptop that can browse the web, stream Netflix, and maybe open a Word doc. They will be more sensitive to hardware problems in my opinion. A janky touchpad is going to be annoying no matter what computer task you're doing. A wobbly keyboard will be the same. To me an average consumer is more interested in the "feel" of the computer rather than what it can do.
Last time I was shopping for a laptop, I needed battery life, low glare, and high screen brightness; ruggedness was a plus. Cheapness is a good proxy for ruggedness. Being able to upgrade/repair components is generally something I value highly too. Something that's made to be maintained, meaning opened, disassembled (and reassembled!), feels good to me.
Used thinkpads and dell latitudes, battery and brightness aren't always what I'd like though. Frameworks and similar sound nice, but can't bring myself to pay the premium.
I have a framework and love it - repair-ability is exceptionally important to me, and I support it as often as I can.
That being said, I have a really specific use-case I have to fill right now: I travel all the time for work, and my work laptop already takes up a good amount of space, so I need something small and easy to use when I travel
I guess a lot hinges on what kind of work you need to do while travelling. I've been on tmux and vim for the longest time which works great over mosh, so almost any device worked for me, as long as I had an ok internet connection. Spent a summer working in a park, on an epaper tablet and a bluetooth keyboard. Good times.
Sometimes I wish I could just use any laptop and Remote Desktop into my gaming rig which is awesome. Then I can have whatever form factor laptop I want, but the problem is I think the latency still sucks (maybe not?) on stuff like Parsec even locally.
The latency is acceptable. I host a remote gaming rig accessed through Parsec, the extra encoding/decoding latency is minimal. Distance has the largest effect, I found that over roughly 1,000 km my total latency is about 30 ms, perfectly playable for all but the most competitive FPS.
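For what it's worth, those numbers line up with a back-of-the-envelope budget. A quick sketch (the 20 ms fixed overhead for capture/encode/decode/display below is an assumption for illustration, not a measurement):

```python
# Back-of-the-envelope streaming latency. Light in fiber covers
# roughly 200 km per millisecond.

FIBER_KM_PER_MS = 200.0

def total_latency_ms(distance_km, overhead_ms=20.0):
    """Round-trip propagation plus a fixed pipeline overhead
    (the 20 ms default is an assumed figure)."""
    return 2 * distance_km / FIBER_KM_PER_MS + overhead_ms

print(total_latency_ms(1000))  # 30.0 -> in line with ~30 ms over ~1,000 km
```

So at 1,000 km, propagation alone eats about 10 ms round trip before any encoding or display latency.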
I don’t have WiFi 6E, just 6 (5 GHz). If the input latency is imperceptible for productivity work, and the resolution matches my laptop resolution without pixelation, then why the heck am I not streaming my powerhouse main PC?
I agree. I read this and immediately thought to myself: The gloves are off.
The price point, the capability: the only thing stopping Apple at this point is the MDM stuff for integrating with other identity providers, but it's ahead of where it used to be.
The MDM stuff is there now, and platform SSO works pretty well, at least with Entra and Okta (the only two I have experience with). Both Jamf and Intune support it, and I'm sure all the other MDMs do as well.
The only time macs can be a bit of a headache is if you are still using all on-prem AD & group policy and trying to force them into that environment via joining the mac to AD.
Microsoft is forcing everyone onto Azure AD or whatever so that should fix that.
Last time I dealt with Apple MDM was integrating it with on-prem AD and it was a pain. I know it’s better now because last few “gigs” have used it and it’s been pretty seamless with Microsoft Authenticator for Teams. (Ugh!)
> Other than Microsoft nobody even makes decent laptops in the Windows world.
Strong disagree on this one - there are some great laptops available, they just aren't "macbook clones". I have an Asus Rog Strix that I love. Lenovo have great ones, Dell, even HP is back in the game somewhat.
I use a macbook professionally, but still don't like the keyboard very much. The display is good, but my Asus display is better. Aluminum is pretty, but I don't like the feel of it on my wrists.
> The best laptop is now significantly cheaper than the horrible ones.
Possibly, but I would wait for reviews to make that call. The hardware is slower than other MacBooks; memory may be slower, too, and other hardware may be slightly worse in quality.
> Your new MacBook Neo.
Just the way you want it[sic].
13-inch MacBook Neo in Indigo
A18 Pro, 6-core CPU, 5-core GPU, 16-core Neural Engine
Apple Intelligence
8GB unified memory
256GB SSD storage
U.S. English Magic Keyboard with Lock Key
20W USB‑C Power Adapter
Two USB-C ports, 3.5 mm headphone jack
Support for one external display
8 GB unified memory is brand-new e-waste today. macOS 26 makes it even worse.
> 8 GB unified memory is brand-new e-waste today. macOS 26 makes it even worse.
One reason Apple can get away with 8 GB of RAM is that their SoC does realtime compression of data in RAM and they use high-bandwidth memory; the A19 Pro RAM bandwidth is 60 GB/s. This enables them to treat the SSD like an L3 cache.
It's nearly 5 years since the M1 was released; I suspect Apple has gotten really good with their RAM → compression → SSD system since then.
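As a rough illustration of why compression stretches RAM: OS memory compressors squeeze inactive pages, which in real process memory are often highly repetitive. A hand-wavy sketch using zlib as a stand-in (Apple's actual compressor is a different, much faster algorithm, and the page contents here are made up):

```python
import zlib

# Simulate a fairly repetitive 16 KiB "page" (zero runs plus a repeated
# structure), the kind of data an OS memory compressor sees often.
page = (b"\x00" * 96 + b"struct fields here!!" + b"\x00" * 12) * 128
assert len(page) == 16 * 1024

# Fast compression level, as an OS would favor to keep latency low.
compressed = zlib.compress(page, level=1)
print(f"{len(page)} -> {len(compressed)} bytes "
      f"({len(page) / len(compressed):.0f}x smaller)")
```

The win obviously depends on how compressible the pages are; incompressible data (already-compressed media, random bytes) gains nothing.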
I will take MKBHD's take on this[0]. Great as a higher-end 'chromebook' etc. Could be an upgrade for my Surface Go 3 but not as portable. Definitely more useful than a tablet.
I'm going to give Apple the benefit of the doubt here until proven otherwise. I can't see them releasing something with a terrible user experience as it would cause a lot of reputational harm.
In the year that I had a Surface, I can count on 2 hands the number of times that I used the touch screen. Out of all those times that I used touch screen functionality, the majority of the times were done inadvertently when I was trying to get something off the screen. I'm willing to bet a lot of people won't/don't care about the touch screen, they just want something cheap.
I have a Windows laptop with a touch screen. The only time I touch the screen is when I take a screenshot using the Snipping Tool and want to circle something.
At 150% scaling, one logical pixel maps to 1.5 physical pixels. When a 1px grid line is drawn, the renderer cannot light exactly 1.5 pixels, so it distributes the color across adjacent pixels using anti-aliasing. Depending on where the line falls relative to device-pixel boundaries, one pixel may be fully colored and the next partially colored, or vice versa. This shifts the perceived center of the line slightly. In a repeating grid, these fractional shifts accumulate, making the gaps between lines appear uneven or "vibrating."
Chromium often avoids this by rendering 1px borders as hairlines that snap to a single device pixel, even when a CSS pixel corresponds to 1.5 device pixels at 150% scaling. This keeps lines crisp, but it also means the border remains about one device pixel thick, making it appear slightly thinner relative to the surrounding content.
For some people such artifacts are not noticeable; for others they are.
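To make the fractional-coverage point concrete, a small sketch (plain Python, not browser code; the grid positions are made up for illustration):

```python
import math

# How a 1 CSS px line rendered at 150% scaling (1.5 device px wide)
# covers device pixels, depending on where it starts.

def coverage(start, width=1.5):
    """Fraction of each device pixel covered by a line spanning
    [start, start + width) in device-pixel coordinates."""
    end = start + width
    return [min(end, px + 1) - max(start, px)
            for px in range(math.floor(start), math.ceil(end))]

# Grid lines every 15 logical px land every 22.5 device px, so
# alternate lines straddle pixel boundaries differently:
print(coverage(22.5))  # [0.5, 1.0] -> half pixel, then full pixel
print(coverage(45.0))  # [1.0, 0.5] -> full pixel, then half pixel
```

The two lines get mirror-image coverage, so their perceived centers shift in opposite directions, which is exactly the "vibrating" grid effect.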
I'm one of those people who are super sensitive to the issues you describe, and let me tell you this: the scaling value (like 150%) is just a number.
For the most part, non-ancient renderers (3D but also to a large degree 2D renderers), do not care about physical pixels, and when they do, they care the same amount no matter what the DPI is.
Raster data has a fixed number of pixels, but is generally not meant to be displayed at a specific DPI. There are some rare applications where that might be true, and those are designed to work with a specific display of a given size and number of pixels.
It's especially older applications (like from the 90s and 00s) that work in units of physical pixels, where lines are drawn at "1 pixel width" or something like that. That was OK in an age where targeted displays were all in the range of 70-100 DPI. But that's not true anymore; today the range is more like 100 to 250 or 300 DPI.
One way to "fix" these older applications to work with higher DPIs is to just scale them up by 2: each pixel written by the app results in 2x2 pixels set on the HiDPI screen. Of course, a "200%" display, i.e. a display with 192 DPI, should be a good display to do exactly that, but you can just as well use a 160 DPI or a 220 DPI screen and do the same thing.
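The 2x2 idea can be sketched in a few lines (nested lists standing in for an image; this is just an illustration of nearest-neighbour integer upscaling, not any OS's actual code path):

```python
# Integer "pixel doubling": every source pixel becomes a 2x2 block
# on the high-DPI screen.

def scale_2x(img):
    """Nearest-neighbour 2x upscale of a 2D list of pixel values."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(2)]  # duplicate horizontally
        out.append(wide)
        out.append(list(wide))                     # duplicate vertically
    return out

print(scale_2x([[1, 2],
                [3, 4]]))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Because every source pixel maps to a whole number of device pixels, nothing is blurred, which is why integer scaling looks so much crisper than 1.5x blitting.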
It's true that a modern OS run with a "scaling" setting of 150% generally scales up older applications using some heuristics. Important thing to notice here is that the old application never considered the DPI value itself. It's up to the OS (or a renderer) how it's doing scaling. It could do the 2x2 thing, or it could do the 1.5x thing but increase font sizes internally, to get sharp pixel-blitted fonts when it has control over typesetting. And yeah, some things can come out blurry if the app sends final raster data to the OS, and the OS just does the 1.5x blur thing. But remember, this is an unhappy situation just for old applications, and only where the OS receives raster data from the App, instead of drawing commands. Everything else is up to the OS (old apps) or the app itself (newer, DPI-aware apps).
For newer applications, e.g. on Windows, the scaling value influences nothing but the DPI value (e.g. 150% or 144 DPI) reported to the application, everything else is up to the app.
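A minimal sketch of that mapping, assuming the usual 96 DPI baseline (which is also where CSS's 1px = 1/96 inch comes from):

```python
# Windows treats 100% scaling as 96 DPI, so the reported DPI is
# just the scale factor times 96.

def effective_dpi(scale):
    return 96 * scale          # 1.5 -> 144.0

def device_px(css_px, scale):
    return css_px * scale      # 1 CSS px at 150% -> 1.5 device px

print(effective_dpi(1.5), device_px(1, 1.5))  # 144.0 1.5
```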
Sorry, none of that makes any sense to me. Go to Best Buy, where they have Surface laptops on display. Open the browser and go to a website where a grid with 1px horizontal lines is displayed. I immediately notice that the lines are disproportionately thin. You may not notice it, and that's fine.
I've started out with a longer reply, but let's try and condense it a little: I concede you can still find this issue, especially on less than 4k displays, but it's becoming less and less of an issue, because of improving software. Where you see the issue, it's simply a software problem -- CSS or Chrome or the website/app should be fixed.
I don't see much of this issue anymore, on my 27" 4K screen, set to 175% scaling.
It's logical that if you want to do arbitrary scaling or zooming, and want to keep all distance ratios perfectly intact, you will experience some ugliness from antialiasing (less and less noticeable as you go to ~4K and beyond). That will be so regardless of scaling, even when it's set to 100%.
So if it ought to look good, software simply needs to be written more flexibly! 1px in CSS doesn't mean 1 physical pixel but it is a (quite arbitrary) physical distance, defined as 1/96th of an inch. It's all up to the app and the software stack deciding line widths and how they will actually come out on a screen in terms of pixels lit. They should _respect_ the scaling setting (like 150%) but they also are in full control, in principle, to make it look good.
<hr> lines come out perfectly fine on my screen (175%, 4K) with Firefox and Chrome.
1.5px width lines will come out quite bad with 100% scaling, but will look perfect with 150% scaling obviously.
Notice that vector fonts generally look better if you have a reasonably high-DPI display. But on average, it doesn't matter if you test font sizes of 20pt or 21pt or 17pt or whatever. Why is that? Because font rasterizers already snap to pixels. They properly quantize geometric objects. They don't make arbitrary stubborn choices like "it must be exactly 1/96th of a virtual inch wide"; they are a little flexible to fight antialiasing.
And the more high-DPI monitors there are, the less software will be making stubborn choices.
Except most Germans don't buy Surface Laptops, and there are much cheaper options with 8 GB, naturally they lack a glowing apple to show off at Starbucks.
It is also actually 800 euro if you want proper SSD storage in 2026.
And as mentioned, get out of the German economy, into the southern and eastern countries, or across the Mediterranean, to see who gets a Neo outside of well-off families, or maybe bundled with a five-year cable TV contract.
I miss the glowing apple on my white polycarbonate MacBook. What I don't miss is the shitty Intel GMA X3100 iGPU and Apple not releasing a 64 bit driver for it.
Should have spent the money on a MacBook Pro with a real GPU; I would have used that computer way longer than I did.
What computers are you buying that are more environmentally friendly? The MacBook Neo is 60% recycled materials, and Apple offers free recycling for all their products.
> what computers are you buying that are more environmentally friendly?
Any computer that you can upgrade its parts? SSD, RAM, Wifi cards, etc.
The only parts that wear out on a modern laptop are the SSD and the battery. If I replace those, I can use it basically indefinitely, paying the penalty on performance and energy consumption depending on how old the CPU is.
Why would I throw out (or recycle) a perfectly good computer if I could simply fix or upgrade it? If you're not reusing it, then you could pass it down to somebody who would use it.
20+ year old computers are e-waste at this point, thanks to software bloat and the lack of hardware acceleration for at least H.264.
15 year old computers are very usable, but unfortunately most use SATA for storage which is definitely not optimal for SSDs.
10 year old computers are from when PC tech plateaued, for most use cases the difference in performance is imperceptible, and maybe you lose power efficiency.
nowadays macbook batteries aren't something i'd call "easy to replace" but it's not something a typical repair shop or meticulous individual wouldn't be able to do – most beater windows laptops don't have user-replaceable batteries either fwiw
if the ssd is bricked you do need to replace the whole "logic board" tho which sucks
Being able to add RAM and replace storage with faster flash typically extends the useful life of a computer, even if the CPU is not replaceable the way it is in desktops.
On my machines the limiting factor is not CPU, but memory and GPU speed. Low RAM and slow GPUs prevent me from running local AI models. These things will only get bigger over time. I wouldn’t expect a developer machine to still be useful 5 years from now with less than 16 or 32GB.
The Apple logo is on the wrong side of the screen to be concerned about. Apple's OS and user experience is miles ahead of the competition and so are the displays they use.
I disagree about Apple's OS and UI, I prefer the user experience of Linux :)
With a distro like Linux Mint or Ubuntu everything basically "just works", and you have much more freedom with how you set up your computer. Plus, while Apple is generally better than Microsoft about not bloating their OS with bothersome corporate BS ("log into your Windows account! Sign up for OneDrive! AI in your email!"), they're not exactly perfect.
The Mac OS is the thing that keeps me away from those computers. I really don't like when a piece of software tries to treat me like I have some kind of brain injury and needs to "help" me at every point.
The post that I commented on was arguing that what sets the Mac apart from other options with 8 GB RAM, and what makes it more expensive, is that it is seen as a status symbol. I made a point against that by mentioning two areas in which Macs are truly superior.
To each their own. The OS is easily one of the most frustrating I’ve ever been required to use. It does some things very well, but many things absolutely infuriatingly.
Now, yes, almost everything about Apple’s hardware UX is a light year ahead of most competitors. That’s been true for ages.
Apple says the magic keyboard is "much loved." Eh, I can't think of anyone I know who uses a MacBook who agrees with that, even the biggest Cupertino fanbois. It's a terrible typing experience for touch typers, as others have commented here.
Making MacBooks thinner and thinner creates diminishing returns when it comes to the keyboard. The "butterfly" design isn't very sturdy and on both my previous MacBook Pros several of the keys stopped working after a few years and had to be repaired.
all apple needs, to kill surface laptops entirely, is to enable windows to run on m series laptops without issues.
I don't know why the downvotes, maybe someone can chime in if there is more to the surface laptop? because i am using one, and much prefer to use windows on an m4 macbook pro instead.
The games industry remains a hotbed of people that vehemently hate Apple, even those that have never touched a Mac.
Part of it historically was a sort of Visual Studio induced Stockholm Syndrome, where for a long time if you were doing C++ work that was the only sane way to go.
There are some companies that even filter potential employees on this basis.
Apple leaves gamers alone, it does not even attempt to be a nontrivial gaming platform and makes no promises. Why would gamers and gamedevs hate it? It just doesn't exist in their market.
Bootcamp was a hedge when Apple was a lot less dominant than it is now.
When Apple transitioned from PowerPC to Intel, it wasn’t clear that was going to work. Being able to boot into Windows was sort of an insurance policy that’s no longer necessary.
It made migration from Windows to Mac easier. Now that Office can run in a browser and the Mac has first tier support for the desktop version, and a lot more of the usual software is delivered either as web applications or portable apps on top of a browser runtime, being able to boot Windows is a lot less relevant.
Oh man, I'd pay a premium for a sleek device like this that CAN'T run Windows or Mac OS. Just the mere thought of being able to run pure Linux on that sweet sweet Apple silicon with full driver support gets my juices flowing.
Google Docs might handle 50% of people who use Office, but I doubt if it handles 25% of Office use cases. Few use every feature of Office but someone uses every feature of Office and all those power user cases that are different can’t be handled by Google Docs. Or even Web Office.