
Let that sink in. And you know what is not arriving anytime soon? Even “cat-level” AGI is just not going to happen within the next 20 years, at least.


20 years ago, the Linux kernel was at version 2.5, we had the Intel Pentium III, Apple had just released the iMac G4 (the "Lamp"), Nickelback was No. 1 with "How You Remind Me" for over 16 weeks, Warcraft III was released, and I bought my first Palm Tungsten.

This reads like another world because it was, and I suspect the world will change even faster in the next 20. We will have cat-like AGI before 2042.


Your examples work against you, I think. So, in 20 years we got some incremental improvements. I think ’82–’02 was a bit more dramatic than your example. AGI requires crossing a barrier that we don’t really know is realistically possible with the kind of technology we currently have.

There’s a lot of pop crap on the radio now, WoW just released another expansion, the latest iPhone has some pretty forgettable features compared to previous iterations, etc. It’s not like the world is exponentially different after 20 years.


The jump from 134,000 transistors (’82) to 9.5 million (’02) was a significant one, but it's nothing compared to the leap to 20 billion (2022). It's a staggering difference that truly demonstrates the magnitude of the advances made in the last 20 years. The 80s and early 90s were computational winters by comparison.
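A quick back-of-the-envelope in Python, taking those transistor counts at face value (they're the figures quoted above, not measured data):

    # Transistor counts cited above: early-80s CPU, ~2002 CPU, ~2022 CPU
    t_1982 = 134_000
    t_2002 = 9_500_000
    t_2022 = 20_000_000_000

    print(f"'82 -> '02: {t_2002 / t_1982:,.0f}x")  # ~71x
    print(f"'02 -> '22: {t_2022 / t_2002:,.0f}x")  # ~2,105x

So on these numbers the second 20-year span scaled roughly 30 times harder than the first.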


I think you should go back to Patterson’s book. The benefits of Moore’s law ended, and then what happened? What you’re suggesting flies in the face of all conventional wisdom about computer hardware.


Moore's law ended? Because Nvidia's CEO said so, to have an excuse for their dud of a generation? ASML, TSMC, Intel, Apple, and AMD are all saying it's still going strong.


All things considered, there isn't much of a difference between 2012 and 2022 consumer computing: 4/8 cores, all pretty much clocked the same, maybe 8GB/16GB of RAM and 500GB of storage. At best, we optimized the process and optimized edge cases, but it is certainly not an exponential leap.

Between ’90 and ’00, we went from 40MHz CPUs to 1+ GHz, and from 1GB to 100+GB of storage capacity: a factor of 25 for clock speed and 100 for storage. At the ’90–’00 pace, by 2022 we would have had 25GHz CPUs and 1PB disks, but consumer drives topped out at a few TB at best. Instructions per clock per core went from 0.33 (1992, Intel i486DX) to 4.1 (2002, Pentium 4 Extreme Edition) to 8.46 (2020, AMD Ryzen Threadripper 3990X), so in 20 years we had less advancement (x2) than in the previous 10 (x12).

Besides that, Moore's law states a doubling of the metric every 2 years; 20 years is 10 doubling periods, so we should have scaled by a factor of ~1000 (2^10 = 1024). So yeah, Moore's law is long dead.
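Redoing that arithmetic in Python (the figures are the ones quoted above, so treat this as a sketch, not a benchmark):

    # Expected Moore's-law scaling: one doubling every 2 years
    doublings = 20 / 2
    print(2 ** doublings)       # 1024.0, i.e. the ~1000x figure

    # Per-core instructions-per-clock figures quoted above
    ipc_1992, ipc_2002, ipc_2020 = 0.33, 4.1, 8.46
    print(ipc_2002 / ipc_1992)  # ~12.4x in 10 years
    print(ipc_2020 / ipc_2002)  # ~2.1x in the ~20 years since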


Moore's law has nothing to do with performance; the metric is transistor count, and as in my example in the previous post, we exceeded your 1000x figure...


Do you have any idea who Patterson is?


Why even bring him up repeatedly when he himself says it's not dead yet? Maybe it goes belly up in a year or two, but at the moment it's not dead.


Many believe that AGI will be developed independently of our understanding of the human brain, cognition, and consciousness.

I personally find this quite strange, as if intelligence and cognition could be brute-forced without a good understanding of those concepts and how they apply to humans.

Anyway, I’m hoping to do some reading on this topic in the future. If anyone has suggestions for books and papers, including much earlier works, please post them here. I’ve enjoyed the Chomsky interviews, and I’ll probably read some of his references.


I think the rate of change over the next 20 years will be slower, mostly because the rate of change in computing power has slowed down. AI training compute has been doubling every 6 months, but how long can that continue without fundamental improvements in hardware? At this rate we would need to increase compute by ~1000x by 2027.
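The arithmetic, as a quick Python sketch (assuming the "doubling every 6 months" trend holds exactly from 2022 on):

    # Two doublings per year from 2022 to 2027
    years = 2027 - 2022
    print(2 ** (2 * years))  # 2^10 = 1024, i.e. ~1000x more compute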



> This reads like another world because it was

So...another world with a slightly different taste in music and less advanced consumer electronics? Pretty picayune difference.


Ever gone out in a small town or village? Not much has changed there in the last 20 years; most of the change happened at home. Take away the flatscreens, smartphones, tablets, and calendars (doh), and nobody could tell you whether it's the '00s, '10s, or '20s.


Uh, yes, that's my point. Now I'm not sure which side of this question you're on.


The one that says that despite almost everything staying the same, everything changed. We can go into socioeconomics if you really want...


Hard pass. Thanks for your time.



