
Think of the hundreds of millions or even billions of hours of unhappiness caused by whatever utter wanker designed the Windows update system. All that misery of waiting to go home to your actual life, but you can't because your laptop's displaying 'Do not turn off or unplug your computer'. No, just sit here and watch me. Or those times you accidentally opened Word or Photoshop or MySQL Workbench (god that's slow) or whatever. You made a mistake? Well fuck you. I'm not listening to you, says your computer, I'm just going to ignore you for a little bit.

That's why.

And they're not enormously complicated: Paint opens instantly and has the same UI, and Word is little more than a glorified textbox. Programmatically, all the features make those programs enormously complex, but the actual common use of the programs is simplicity itself.



Have you used a recent copy of Word (e.g. Word 2010) or any of the other Office software? It all opens basically instantaneously on my system, and my system is like 4 years old now. They still show the splash boxes this guy is complaining about, but the splash boxes are blink-and-you-miss-it fast and then the real app is there. Less than a second from click to load of the real app.

The Adobe apps still do have more significant load times, more like 5-8ish seconds, but RAM is cheap, just load that app once and keep it open forever. Hibernate your system instead of shutting it down cold so that the apps don't have to reload from initial state when you next use them. This also makes Windows pop up very quickly (but even from cold boot modern Windows only takes like 15-ish seconds on my (again 4 year old) system), which isn't anything to complain about compared to any other desktop OS out there.

All in all, I'm in the group of people very confused about this post. I'm historically someone who absolutely hated long load times, but it isn't something I've worried about for any app or OS I use in years now.


You think they put that notification there to annoy you? No, it's because of possible data loss and/or system file corruption if you turn off the power while the system files are being patched. Splash screens exist because the program is loading. If you want the functionality of Paint, or Notepad, open those instead. If you want Photoshop, you open Photoshop, and it takes a while to load because it is gigantically complex. If you want it to load faster, buy a new computer.


Surely Windows updates are applied via some sort of transactional system that can be rolled back and re-attempted if the process fails for whatever reason?

Having a power cut during a Windows update shouldn't result in a completely trashed system.


Surprisingly, NTFS does support transactions [1], and the feature has been there since Vista, while as far as I know HFS+ on Macs has no comparable feature. So what you say may in fact be true.

I'm curious if cutting the power on a Mac while it's moving files into place will break a software update, or if the whole package receipt mechanism prevents that from occurring.

[1] http://en.wikipedia.org/wiki/Transactional_NTFS


I'm not sure even transactional NTFS would protect you in this case. From the wiki link:

Transactional NTFS is implemented on top of the Kernel Transaction Manager (KTM), which is a Windows kernel component.

Because this is implemented on top of the kernel itself, if you have brought down the kernel in order to update files within said kernel, you likely are not going to be able to leverage the transactional rollback. You might be able to do a system restore if you boot from CD, but breaking your kernel is not an easily recoverable situation. I suspect there are actually safeguards in the update procedure which protect against this situation, but things can go wrong and it is really not something you want to have to rely on.


I was thinking more of having a transactional system within the update software itself independent of anything on the filesystem.

Something like this:

1. Download all compressed archives that are required for the update from the update website and unzip somewhere.

2. Check the package manifest and figure out which files need to be changed/added/deleted.

3. Write a flag somewhere on the boot drive that says the update process has begun and which files will be altered.

4. Make copies of all the files which will be changed.

5. Work through the update process by modifying or overwriting the copied files with the contents of the update archives.

6. Temporarily suspend the scheduler so the update process is the only thing running and release locks on all of the files which will be changed.

7. Work through every file that needs to be changed and link the filesystem reference from the old version to the new version whilst keeping a copy of the old version.

8. At every stage in step 7, mark in a log which references have been updated.

9. Mark a flag to indicate that the update process has completed, then either resume the scheduler and reinstate the locks or force a restart of the OS if necessary.

When the system next starts up as part of the bootup process it can check if both the transaction start and finish flags are set. If the start flag is set but not the finish flag then it knows that an update failed so it can roll back by re-linking to the old versions of every file (reading the logs to know which files to re-link) and setting the start flag back to 0 so it can try again.

If the update was successful then it can delete the old files if the disk space is needed or keep them around in case there is an issue later which required a restore.
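Sketched out, steps 3-9 plus the boot-time check might look something like this (illustrative Python only, off the top of my head as well, and certainly not how Windows Update actually works):

```python
import json
import os
import shutil

def apply_update(staging_dir, targets, state_dir):
    """Journal-based update: back up each target, swap in the staged
    version, and record every swap in a journal as we go."""
    journal = os.path.join(state_dir, "journal.json")
    backup = os.path.join(state_dir, "backup")
    os.makedirs(backup, exist_ok=True)

    def write_journal(finished, swapped):
        with open(journal, "w") as f:
            json.dump({"started": True, "finished": finished,
                       "swapped": swapped}, f)

    # Steps 3/8: flag that the update has begun, with an empty swap log.
    swapped = []
    write_journal(False, swapped)

    for name in targets:
        # Step 4: keep a copy of the old version for rollback.
        shutil.copy2(name, os.path.join(backup, os.path.basename(name)))
        # Step 7: move the staged new version into place.
        shutil.copy2(os.path.join(staging_dir, os.path.basename(name)), name)
        swapped.append(name)
        write_journal(False, swapped)

    # Step 9: mark the transaction complete.
    write_journal(True, swapped)

def recover(state_dir):
    """At next boot: if the start flag is set but not the finish flag,
    roll back by restoring the backed-up versions."""
    journal = os.path.join(state_dir, "journal.json")
    if not os.path.exists(journal):
        return "no update in progress"
    with open(journal) as f:
        state = json.load(f)
    if state["finished"]:
        return "update completed"
    for name in state["swapped"]:
        shutil.copy2(os.path.join(state_dir, "backup",
                                  os.path.basename(name)), name)
    os.remove(journal)  # clear the start flag so the update can retry
    return "rolled back"
```

A real updater would also have to fsync the journal and handle a crash mid-copy, but the shape is the same: every destructive step is logged before it happens, so the boot-time check always knows what to undo.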

Regarding the kernel example, my Linux install actually keeps old versions of the kernel on the system so that if a kernel update breaks something for whatever reason it is still possible to boot the system from the previous kernel. I imagine Windows and OS X do something like this, although possibly more transparently.

Note: This is what I could think of off the top of my head, I'm sure it's not a perfect way of doing it but it demonstrates the idea.


I don't know if it follows those exact steps, but in the last few months I've had a machine crash several times (flaky power supply) in the middle of various Windows updates and it always recovered pretty well. It looked to this outsider like there was some sort of journalling going on.


On HFS+ on OS X, most file writing is done atomically, by writing to a temp file and atomically switching the (conceptual) data pointer of the target file, aka FSExchangeObjects. It's common to end up with some of these temp files in ~/Library/Preferences, ending in .plist.asdfx, when the process was interrupted (so the original remains untouched).

This can be done for directories too - make a new one, write changed files and hard link unchanged ones, then FSExchangeObjects. Obviously this scales linearly with the number of files in the directory, so it's not perfect. I'm not sure if / how any of this is used for system updates, though.
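The write-temp-then-swap trick isn't Mac-specific, either. Here's the same idea sketched in Python using os.replace, which is an atomic rename on POSIX filesystems (FSExchangeObjects itself is a Carbon API, so this is only an analogy):

```python
import os
import tempfile

def atomic_write(path, data):
    """Write to a temp file in the same directory, then atomically
    swap it into place, so readers see either the old contents or the
    new contents, never a half-written file."""
    dirname = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())   # make sure the bytes hit the disk first
        os.replace(tmp, path)      # atomic rename on POSIX filesystems
    except BaseException:
        os.unlink(tmp)             # interrupted: the original is untouched
        raise
```

If the process dies before the rename, the worst you're left with is a stray temp file next to the original, which is exactly the .plist leftovers described above.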


> You made a mistake? Well fuck you. I'm not listening to you, says your computer, I'm just going to ignore you for a little bit.

The programs from Office 2010 have a _x on the splash screen which lets you easily minimize/kill the program in question. It's really snappy, works like a charm.


Don't like that feature of Windows Update? Turn off automatic updates and do it manually when you wish.

Although I agree about the annoyance of accidentally opening something like Photoshop and having to wait to close it. But a good number of the resources it loads are third-party stuff that demands to be present immediately on startup as opposed to being loaded on demand. But whatever.

Paint has the same UI as what? I hope you're not comparing it to Photoshop or any other professional level software. I believe the modern version of Word is a good bit more than a glorified textbox.

But then you point out exactly why these programs load slowly in your last sentence. Maybe just start using less complex software?


Their interface is the same. Are you utterly blind? Stop looking at the UX as a programmer and imagining the complexity of the features. The actual screen is displaying exactly the same stuff. There's a big white box to draw in and a bunch of buttons to press.

You do not need to initialize all the third-party tools; you just have to find out which tiny little icon to add to a toolbar or which extra menu items to add. And why didn't you cache that when the extension was first detected?

And how silly of me, of course all the corporations these days let you turn off the updates on their computers don't they?

Stop telling your users 'if they just...'. Do the opposite with your UI to what you do with your code: start programming for the best-case scenario, not the worst case.


The interface may be the same but I'm willing to bet under the hood they are quite different on a large scale. You seem unwilling to admit that.

I don't understand what you are saying about third-party tools. Could you clarify? Is that whole sentence talking about the same thing? Because I read it as having four different topics in there.

If a corporation won't allow you to adjust update settings then that's an issue to take up with IT since they are preventing you from doing what you need, not the software itself. What if the default settings were exactly what you wanted and your IT department prevented it, is that also the fault of the software? I think your complaint is misdirected in this case.

Ask ten different people to define best case scenario and I bet you'll get several different answers. Which one do you choose? You can't make everyone happy but I guess since you're not happy then your way is the best choice? What you complain about telling users "if they just..." is what I call choice. When I say an application won't do something I need then I WANT the answer to be "if you just do this..." so I can have that choice.


I once wrote a clone of Microsoft Word for a company looking to move away from Office dependency - I think you're really understating the complexity of something like Word when you call it a glorified textbox or say it should load as quickly as MSPaint because they both use the ribbon UI.


>I once wrote a clone of Microsoft Word for a company

Out of curiosity, when was it? Wasn't openoffice.org an option?


The chief complaint I have with HN's lack of reply notifications is that interesting questions like this one will go unanswered forever.


>Their interface is the same. Are you utterly blind? Stop looking at the UX as a programmer and imagining the complexity of the features. The actual screen is displaying exactly the same stuff.

Seriously? I mean, my Dodge Neon has the same gas, brake, clutch and steering wheel that a Ferrari 599 has, why doesn't it 0-60 as fast? Stop comparing engine size and complexity, it has all the superficial elements of a sports car! God, you're such a mechanic.


I think I understand what he's saying. Sure, your Dodge Neon may take longer to go 0-60 (i.e. it's okay if Photoshop is slower overall), but there would be something wrong with it if it takes significantly longer just to get in the car (i.e. Photoshop shouldn't take longer to boot up).


No, I think the engine metaphor is pretty apt. I completely understand what he's saying, but the alternative he seems to imply is no better. Why would I want the interface to load but not be able to use it for another 45 seconds? Okay, so I'm in my car (clicked on the shortcut) but I won't be able to go anywhere (use the application) for a few minutes because it's slow to accelerate (load). Who cares about the max speed of your car as long as it 0-60s in a reasonable amount of time?

Either way it's going to take some time to load because it's a big application. One way lets you know it's working, the other makes you think it has frozen as you're fruitlessly clicking around. The interface is the least important thing in the application until it's fully loaded.


I think it makes sense. To extend your metaphor, you can do things such as turn on the radio without noticing a performance difference between the two cars.

The idea with Photoshop is, sure, loading the entire thing will take much longer, but users tend to use one tool at a time, and loading an individual tool shouldn't take anywhere near as long.

It also makes sense from the point of view of parallelization. One example of an extremely slow operation is waiting for the user to click on something. So instead of sequentially loading things, followed by the user deciding what to do, Photoshop should continue loading while the user decides e.g. what brush size and color he/she wants.

Like any other, this approach has advantages and disadvantages - it would indeed be frustrating if the brush tool hadn't loaded by the time you started using it, and it would probably require a lot of work for Photoshop to load so modularly and on-demand - but all I'm saying is, I understand the value in the alternative he suggests.


To go beyond cars and Photoshop, the RPG Guild Wars is a good example of only loading what you need. In fact, it doesn't even download the other game zones before you need them. If it sees you running towards the next zone, it begins buffering in the background to pull in the data. So instead of making you wait while the game downloads and installs everything right up front, it downloads and then loads exactly what you need. This is much harder in an application, which is more non-linear.


When you run a program the OS will usually just map the code into the address space; it won't necessarily load the code from disk until it needs to.

It's possible that what the program is doing at initialization is not loading code from the disk but doing something else, like perhaps checking it has a nice big contiguous area of disk to use for temporary storage or for loading some type of cache into.

There are many times in programming where you make a choice between taking a one-off up front cost to optimise something for faster overall performance vs slower overall performance without the setup cost.

For example, with a DBMS you lose some write performance by having an index on a table, since the index has to be rebalanced when you write, but the advantage is much faster read performance.

Similarly, a Java program can take longer on its first run because the JIT is compiling it for the platform it runs on, but this means faster performance of the program afterwards.
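A toy version of that pay-up-front-for-faster-reads tradeoff (illustrative Python, a stand-in for a real DBMS index):

```python
import bisect

def scan_lookup(rows, key):
    """No setup cost, but every lookup walks the whole list: O(n)."""
    for k, v in rows:
        if k == key:
            return v
    return None

class Indexed:
    """Pay a one-off sort cost up front (like building an index),
    then every lookup is a binary search: O(log n)."""

    def __init__(self, rows):
        self._rows = sorted(rows)
        self._keys = [k for k, _ in self._rows]

    def lookup(self, key):
        i = bisect.bisect_left(self._keys, key)
        if i < len(self._keys) and self._keys[i] == key:
            return self._rows[i][1]
        return None
```

Which one wins depends entirely on the ratio of reads to writes, which is the same judgment call an app makes when deciding how much work to do at startup versus on demand.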



