
A full-blown OS with an office suite fit on something like a 100MB drive, with everything opening and responding to user input in a snappy manner. And it didn't show random ads or try to track your every action. You could choose whether and when to install any updates. All the while running on a 486 with a few megs of RAM. How did we get to the current state of software?

Edit: I remember a few places still running 3.11 well into the 00s because it just worked for the given tasks, got out of the way, and didn't require a supercomputer to run. The performance difference was especially jarring when Vista came out and barely worked on contemporary hardware.



> How did we get to the current state of software?

Though playing an MP3 was a heavy task that consumed most of the CPU. It was hard to do anything else while listening to music.

Or think about how pressing Ctrl-S once a minute became a reflex, because MS Office crashed so often.

Hell, does anyone remember how unstable Windows ME was? Or how desktop Linux was practically non-existent?

The current state of software is _amazing_ compared to what was available at that time.


Windows NT 4.0 existed at the same time with the same UI and without the instability. Effortless MP3 playback was primarily made possible by hardware advances, not software ones.


From my memory, the NT branch was unstable until at least XP SP2. Not sure if it was the OS itself, or if it was because most software and drivers were written for Windows 98 and needed a few years to catch up.


Windows NT was rock solid until Windows XP was released, but GPU performance was meh and Windows games generally didn't work well unless they were written to target NT. As I recall, there was one commercial graphical strategy game for NT at the time.

XP broke NT's subsystem isolation in favor of enabling non-NT Windows driver types, with higher kernel/GPU efficiency due to less isolation. This also brought in the ability of drivers to crash the kernel.

And that, my friends, was the end of rock solid Windows NT stability.

I was on the Windows team during the XP SP2 years. Stability was somewhat reachieved by an intense focus on defensive coding to detect rogue drivers, and by the establishment of WHQL, the Windows Hardware Quality Labs. WHQL was basically an investment in driver analysis tools and a moderately sized team inside MSFT whose sole job was to debug and fix other people's drivers.

It's not a good replacement for isolation though, and it requires sustained, continuous effort by both MSFT and the Windows hardware partners, which imho hasn't continued.


Windows 2000 was fairly solid as long as the device drivers weren't too exotic.


> Though playing mp3 was a heavy task consuming most of the CPU. It was hard to do anything while listening to music

A Cyrix MX166 PR200 with 32 MB of RAM played MP3s just fine while compiling the (Linux) kernel and staying snappy at the same time.

The software has just become more bloated. Software engineers these days are not constrained by resources, and this leads to a lot of bad practice.


I remember in the 90s having to actually expand my MP3s back into WAVs so my poor CPU could play them in realtime.


In the 2000s I once talked my mom into getting me an old Mac Centris 610 from eBay. I attempted some C programming, but I had the most success re-encoding MP3s from the Final Fantasy 8 OST and copying them over to the Mac on floppies lol


Two comments in this thread (this one and a sibling) mention needing to save often because programs crashed, as if software reliability were solved by showing ads and tracking activity, or were a function of program size or the speed of the computers, the things FeistySkink had mentioned.


Not sure about the ads, but telemetry definitely increases reliability. It lets you debug very specific issues with comprehensive information about the user's environment. Source: I've debugged a proprietary app multiple times using telemetry information. Some of the issues discovered were not reproducible without recreating very specific environments.

Surely there are downsides to telemetry and tracking, but they can be useful for development.
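As a rough sketch of what that kind of telemetry looks like (the field names and schema here are purely illustrative, not any real product's format), a crash report that bundles the error with the user's environment might be as simple as:

```python
# Illustrative sketch: attach environment details to a crash report so a
# bug that only reproduces on one user's setup can be debugged remotely.
import json
import platform
import sys


def crash_report(exc: Exception) -> str:
    """Serialize an exception plus enough environment detail to help
    recreate the user's setup when the bug isn't reproducible locally."""
    report = {
        "error": repr(exc),
        "python": sys.version.split()[0],
        "os": platform.system(),
        "os_release": platform.release(),
        "machine": platform.machine(),
    }
    return json.dumps(report)


try:
    1 / 0
except ZeroDivisionError as e:
    print(crash_report(e))
```

Real telemetry pipelines add consent gates, sampling, and scrubbing of personal data, but the core idea is the same: without the environment fields, "it crashes sometimes" is often undebuggable.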


MS would like to disagree. There are bugs that haven't been closed after years (mice registering double clicks, blacked-out parts of the GUI, programs starting on nonexistent screens).


Snappy is not how I remember the IO on those machines.


A 486 was certainly not snappy, though Win95 did technically run even on a 386 with 4MB of RAM (i know because i tried it back when Win95 was new).

However on the Pentium MMX@200MHz with 32MB of RAM i had ~1996/97 it was very snappy. On the same machine i also ran my first Linux, Caldera OpenLinux 2.3 in 1999, with KDE1 where it also felt very fast.


I find it hard to believe disk was snappy compared to modern devices, since the original comment mentioned IO. CPU/memory performance comes down to program behavior in addition to hardware; disk is more about size and hardware. And hardware improvements have dwarfed size increases (rather, facilitated them to some degree).


Depends on what sort of disk I/O you have in mind. Floppy disks were never snappy, but HDDs were fine for the most part until the OS started using them for virtual memory. Keep in mind that things were much smaller back then - i remember thinking that my 2GB HDD would never fill up :-P.

Also, in the last few years disk I/O optimization took a massive nosedive, especially under Windows. My 2013 laptop with a ~5400rpm HDD was so unusable with Windows 10 that even the login screen crossfade animation stuttered from disk I/O, and updates took literally hours. Meanwhile Windows 8, released only a few years prior, could boot my older 2008 laptop from power-on to desktop in ~22 seconds (i was so impressed at the time that i uploaded a video of the boot process [0]).

If you are used to the recent awful I/O performance it can be hard to imagine things not always being like that.

Besides you can also check a bunch of YouTube videos with old computers running Win95, here is a short one i found that goes through the bootup process and clicks around the OS for a bit.[1] The computer in the video is actually slower than the one i personally had too.

[0] https://www.youtube.com/watch?v=Ti3LQHXZ0Qg

[1] https://www.youtube.com/watch?v=yfh5ZcDhdZA


There was plenty of software and games that could fit on a floppy or two.


Haha I had the exact same thought. I still hear that “brrbrbrrrbrre” of the HDD arm frantically trying to deal with the overload of doing anything whatsoever.


I do miss the sounds of older, more mechanical machines though. You could be doing something else in the room while your Very Long Task was running, waiting for the silence that signaled it finishing.


The IO at the time was terrible, also because, if memory serves me right, plenty of disks/controllers didn't support DMA transfers.

But if you had a few megabytes to spare for caching (not everybody did, for sure) the machines would be snappy, because the UI was.


I don't recall it being snappy and I also recall it being crashy. But it was a lot better than what came before it. Given the hibernation Apple was in at the time, it was the best alternative. Between the 95 launch and OS X 10.3 was the only time I used Windows as my daily driver.


> Given the hibernation Apple was in at the time,

System 7 and Mac OS 8 were pretty good from a UI perspective, but no memory protection and no preemptive multitasking was not great. For a long time "Mac OS 9" was synonymous with classic Mac, but I always remember 8 being the last tolerable one; by the time 9 came around the foundations were more clearly inadequate. It was like the WinME of Mac OS, tiding things over until OS X.

> OS X 10.3

Yes. I remember some commentary on the first few versions of OS X was that they were resource hogs.


it was a lot worse than the sgi workstations i was using in 01994

also, it was a lot worse than the bsdi bsd/os ("bsd/386") workstations in the same room with them, and those were the same hardware win95 ran on. and it was a lot worse than my friend's linux box


I'm talking about general UI responsiveness. Perhaps this is just nostalgia.


Watching 90's GUI programs is always fun because you can see the graphics as they are drawn, which I don't seem to remember from back then.


If you're nostalgic for that, there's plenty of modern software with a watch-it-paint-itself UI. <coughSagecough>


I have never heard anyone call Sage "modern software" before : - )


Now we have spinners that twirl while we wait for graphics to transmit from another continent.


And GUIs which redraw the whole screen when the cursor blinks. /s


It is. Windows UIs didn't get Amiga levels of responsive until Windows 98SE or so.


And then lost them again during the gap between Vista's launch and SSDs becoming ubiquitous...


I had some of the first SLC SSDs. It didn't help at all. Windows 7 was a massive improvement though.


I have a SSD on my work machine. Win 10 is still slow.


There's some truth to it just for vendor-supported graphics driver reasons alone.


Fast on a Pentium 75, dreadfully slow on 486/25


A 486 with a few megs of RAM could not run Windows 95 in a snappy manner. It would swap like bugfuck. To run Windows 95 usably, you probably needed 8 MiB minimum, 12 or 16 MiB ideally.

With Windows 3.x you could get away with 4 MiB, and that plus Office 1.0 could fit in 100 MiB. Even then, we complained about it.


linux at the time could theoretically run in 2 mebibytes but worked a lot better in 4


According to Wikipedia, Office 95 required 28 to 88 MB of storage.


> How did we get to the current state of software?

You mean how did we get to a point where :

- I don't have to mash Ctrl+S because software crashes constantly

- segfaults are not a daily occurrence

- any random executable can't crash the OS (hell modern windows can survive a GPU driver crash)

- I can use the internet relatively safely

- I don't need to restart my device after every install

etc. ?


Completely agree. Modern software is so much better than old software. Fuck the "ooh it runs on 100MB of RAM" I don't fucking care. All I care about is that it's so much more reliable than before, I can actually get my shit done now.


On the rare occasions that I have to boot Windows to run some government-compatible software on a 32-core, 64GB-RAM, I-don't-even-care-to-check-how-fast-SSD machine, I have to listen to it firing on all cylinders while Defender fights Malicious Software something and Windows updates itself, while I'm typing some text into a barely responsive page. Truly getting my shit done.

Edit: Forgot all the random software doing auto updates and notifying me about it as I go.


That's the price of living in a world where you're connected to the internet by default - a huge amount of performance goes towards security (sandboxing/scanning/rules). Windows 9x didn't even have a firewall FFS.

Windows has to choose defaults for a huge audience; if you daily drove it, it would probably just work. Way better than 95 would.


Eh, no. Linux and BSD had process isolation and ran packet filtering with little effort on contemporary hardware.

You should also be aware that early android/ios phones did all this in "modern times" on limited hardware by today's standards.


> if you daily drove it it would probably just work

More like, "if you daily drove it you'd get used to all the awfulness".


No, you can add defender exceptions/disable it and updates are staggered.


> > How did we get to the current state of software?

> You mean how did we get to a point where :

> - I don't have to mash Ctrl+s because software crashers constantly

Office 365 would like to disagree. Not constantly, but not reliable either.

> - segfaults are not a daily occurrence

Tell that to Teams, which just dies without writing anything to the console.

> - any random executable can't crash the OS (hell modern windows can survive a GPU driver crash)

Yes, but a docking station is the last straw. Power management on USB is hard.

> - I can use the internet relatively safely

Relatively

> - I don't need to restart my device after every install

But after every update ...

> etc. ?

We are living on a planet that's revolving and evolving ...


Hot take: developers today are nowhere near as talented as those of 1995. Every single dev in 1995 was the equal of a principal SDE today. You had to be, because unless your code met that bar, it just didn't work.

There were no guardrails. There was no borrow checker. There was no ASAN. There was no valgrind. Christ, you barely had a debugger. You couldn't copy and paste answers from ChatGPT or Stack Overflow: you either RTFM and comprehended it, or your code didn't work at all.

Today, we spend an enormous amount of compute, memory, and latency compensating for the modal developer being unable to do safe memory management or concurrency.


Windows 95/98 had a lot of problems due to its DOS legacy. A better focus for discussion of what was possible at the time would be BeOS or QNX.


We got to the current state because you left out most of what the OS does now. That's like comparing a 787 to a Cessna, and stating "airplanes used to be light and easy on fuel - how did we get to the current state of the airplane?"

I have a work account that I can use to log into all my devices. All my docs and even the stuff on my desktop are magically synced and backed up, and I even have versions. I can open a Visio file on my laptop in the kitchen, draw some stuff while I wait for the coffee, then go sit at my desk and keep working on that file. When I go for a refill, it's still open in the kitchen and I keep working. Yes - one document, open at the same time on 2 machines, changes being synced live on the screen w/o closing.

When I change my preference to not select an entire word at a time, it magically saves to every Outlook install on all my machines. Plug something in and the driver for it is magically downloaded and installed. Floppy disks? Remember those? Different sizes and shapes and densities? Monitors - many of them.

Gramma can use the computer now - you don't need a geek. Yes, the gramma interface is annoying for the tech user, but the tech user can easily put in a new window manager or tweak the registry; the new dumbed-down default opened computers to everyone, not just geeks. Tracking? Hmm, I don't see any. Maybe because you can (a) turn it off, and (b) install a version w/o it. Yes, if you install the consumer version of Windows made for gramma, it will track everything. That's because it's for gramma, and tracking helps make the next version easier for gramma.

You also don't want gramma to be able to turn off updates, because she will. Updates on my machine? They were set to download and notify - never install, by default. But no, you don't get that on the gramma home version.

Do you ever get into a Cessna and complain it doesn't have room for all 40 members of your group? That's not a valid complaint. That's a bad choice of tool for the job.

The OS is big because the 787 is big. And I say all that, as someone who hates MS and Windows with a deep, long, hard, smelly brown passion.


Does any of this require a supercomputer? How has the computing paradigm fundamentally changed in 30 years?



