In my experience it's been the exact opposite -- back in the "old days" most programmers simply didn't know about linearized space, which is why even Adobe Photoshop does plenty of incorrect calculations that way. And because there wasn't any internet, there was nobody to tell you otherwise.
These days you can at least find references to it when you look things up.
I can confirm: when I started computer graphics, I had absolutely no idea about linear space. I never had a formal education in it, though -- mostly random tutorials, demoscene stuff, things like that.
I think one of the reasons is that in the "old days", in many cases, performance mattered more than correctness. Models were, overall, very wrong, but they were fast and gave recognizable results, which was more than enough. Working in gamma space as if it were linear saved time and wasn't that bad. That gamma space roughly matched the response curve of CRT monitors was an added bonus (one less operation to do).
But things have changed: with modern, ridiculously powerful GPUs, people are no longer content with just recognizable shapes; we want some degree of realism and physical correctness. Messing up color spaces in the age of HDR is not acceptable, especially considering that gamma correction is now considered a trivial operation.
Not knowing about linear space means that people were using linear by default, right? That’s what I would assume. Early games and all the graphics I was exposed to up through college all used linear RGB, but just didn’t call it that, and of course RGB isn’t a real color space anyway. Most people didn’t know about gamma correction, and otherwise almost nobody converted into non-linear spaces or tried to differentiate RGB from something else. Color geeks at Pixar and Cornell and other places were working with non-linear colors, but I guess most people writing code to display pixels in the 70s & 80s weren’t thinking color spaces at all, they just plugged in some RGB values.
According to Wikipedia, sRGB is a standard created in 1996, so yeah, it just wasn't used earlier. However, at the end of the millennium you could create software that opens an image file saved in sRGB and unknowingly apply some algorithm, like dithering, without converting it to linear space first.
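For reference, the sRGB standard defines a piecewise transfer function rather than a pure power curve. Here's a minimal sketch of the decode/encode pair in Python, with constants taken from the IEC 61966-2-1 spec (channel values normalized to [0, 1]):

```python
def srgb_to_linear(c: float) -> float:
    """Decode an sRGB-encoded channel value to linear light."""
    if c <= 0.04045:
        return c / 12.92  # linear segment near black
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """Encode a linear-light channel value back to sRGB."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055

# Mid-gray in sRGB encoding corresponds to only ~21% linear light:
mid = srgb_to_linear(0.5)  # ~0.214
```

Any image-processing step like dithering or filtering would be applied between the decode and the re-encode.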
There was gamma correction and other perceptually uniform-ish color spaces before 1996 and before sRGB. I was taught about the CIE set of color spaces xyY/XYZ/LAB/LUV in school and used them to write renderers before I'd ever heard of sRGB. And yes, exactly right: before they know better, a lot of people will grab something non-linear and start doing linear math on it accidentally. It's still one of the most common color mistakes to this day, I think, but it was definitely more common before sRGB. People sometimes forget basic alpha-blend compositing needs linearized colors, so it's a common cause of fringe artifacts. Things have gotten much better though, and quickly. A lot of game studios 20 years ago didn't have much in the way of color management, and it's ubiquitous now.
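The alpha-blend mistake is easy to demonstrate. A toy sketch, using the common gamma-2.2 approximation of sRGB (not the exact piecewise curve) -- blending two gamma-encoded values directly produces a visibly darker result than blending in linear light:

```python
GAMMA = 2.2  # crude approximation of the sRGB transfer curve

def blend_naive(a: float, b: float, t: float = 0.5) -> float:
    """Blend two gamma-encoded values directly -- the common mistake."""
    return (1 - t) * a + t * b

def blend_linear(a: float, b: float, t: float = 0.5) -> float:
    """Decode to linear light, blend there, then re-encode."""
    lin = (1 - t) * a ** GAMMA + t * b ** GAMMA
    return lin ** (1 / GAMMA)

# A 50/50 blend of black (0.0) and white (1.0):
naive = blend_naive(0.0, 1.0)     # 0.5 -- displays too dark
correct = blend_linear(0.0, 1.0)  # ~0.73
```

The gap between the two results is exactly the kind of darkened edge you see in the fringe artifacts mentioned above.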
That is software targeting Mac and Windows.
Adobe has been notoriously inept at getting color right, except for print.
Already in the old days there was Digital Fusion (now integrated as 'Fusion' into DaVinci Resolve, I think it was e.g. used on "Independence Day") and Wavefront Composer (SGI/Irix, later ported to Windows NT but I may misremember).
Also depends where "the old days" start. I got into CG around 1994, and back then "the bible" was "Computer Graphics: Principles and Practice" by Foley et al.
And the aforementioned newsgroup, and also comp.graphics.rendering(.renderman).
Software that was written in VFX facilities and then became OSS didn't suffer from this, as most color computations happened in f32/float rather than u8/char per channel, and colors were expected to be input linearly.
Often the DCC apps didn't do the de-gamma though. So there was an issue at the user interface layer.
But in the early 2000s the problem was understood by most people working professionally in CGI for the big screen, and all studios I worked at had proper color pipelines, some more sophisticated than others.
As far as OSS 3D renderers go, there were Aqsis and Pixie.
Krita was linear from the beginning, AFAIR. I.e. I recall using it for look development/texture paint on "Hellboy II" -- that was 2007 though.