Dennis Ritchie, the father of everything (popolony2k.com.br)
25 points by blearyeyed on Nov 3, 2013 | 14 comments


Of course, this totally ignores Babbage, Ada Lovelace, von Neumann, Turing, Shannon, Backus, Hopper, Flowers, and many other giants.

And (this might be heresy) C isn't even a particularly good language when compared to better-designed ones like Fortran and PL/1-G.

And I think McCracken is a better language intro than K&R.


They're all important, but arguably Ritchie was more directly influential to modern computing from a practical perspective.

Fortran? Better designed? You're the first one I've seen who claims that. I'm assuming you're referring to the Fortran 90 standard and later, which cleaned up the language significantly and gave it a lowercase name. But the FORTRAN before that was simply ugly. I wouldn't call it Backus' starring moment. I'd say his research on functional programming and being a member of the ALGOL committee was much more important.

Fortran ended up becoming more suited to a completely different domain than C, anyway. Numerical and scientific computing versus embedded systems and low-level programming with high-level imperative mnemonics.

Haven't read A Guide to Fortran Programming, but K&R is still quite great, nonetheless. Although its habits are outdated by now in favor of ANSI C, it's still a handy reference.


It's OK as a systems programming language for the bits you don't code in assembly, but as a general-purpose language you have to do too much heavy lifting with C and its derivatives for my liking. YMMV.

Oh, and I agree with you on ALGOL, BTW.


Also, original C isn't that huge a step away from B. Some insane parts of C are remnants of B, like undeclared functions being assumed to return int.


> Von Newman

It's von Neumann. And it's pronounced more like Noy-Man.


TY, updated my post.


Dennis Ritchie is certainly an important figure that every computer scientist should know about, but that post reads incredibly slowly and is filled with grammatical errors. Looking at the author's other posts, it appears that English is his/her second language, so I won't fault them for it.

Wired put out a nice, concise obituary soon after his death[1], and there are many other great sources online about his life and influence. I would definitely recommend reading up on him if you don't already know much about his life.

[1] http://www.wired.com/wiredenterprise/2011/10/thedennisritchi...


I just started reading K&R coming from an intermediate Python background. Ritchie's style is both simple and engaging. End-of-section practicals are labeled 'experiments', and the hacker spirit pervades the book.

Even better, the tools I was implementing were useful, and it was joyous to use a pipe to feed the program I had just written its own source and get a char-count histogram of it. How meta!

This may sound trivial to the numerous pundits and gurus who make HN their haunt (salute!), but the feeling of joy and discovery is immediate, and the suspicion that I will walk away from this book a better coder (:s/coder/thinker) is inescapable.

Ritchie's thoughts live on; only his body is dead.

edit: %s/Richie/Ritchie/g


Agree!!!

Almost all components of the modern software stack are written in, or built on top of, C code.

The Linux, BSD, and WinNT kernels are all written in C.

gcc and almost all compilers and standard libraries are written in C.

Python, Java, PHP, Lisp, Smalltalk, and Node.js are implemented in C.

Apache and Nginx are written in C.

C++ is just an add-on layer to C.

Chromium, Firefox, and WebKit are all built on top of C/C++.

I haven't been able to find one core component of the modern OS/web stack that isn't based on C.

Can anyone?


So that basically means there are security problems everywhere?


In that case, why do HPC users prefer the Intel compilers?

And arguably Cisco's IOS is a key component of the stack. You are, of course, talking about the full stack, layers 1 through 7?


Goodbye World!


If he's the father of everything, then he's to blame for everything being worse than it could have been with Lisp.

He's the Thomas Edison of this story, and I'm with Tesla.


I don't think C is at fault here so much as the fact that register machines are our primary model of computation.

Smug Lisp Weenies will be just that, though.



