
Yeah, I think calling him the "unenlightened" one is pretty off base here.

For the tasks outlined in his examples, Unix utilities are both easier for the user and faster to execute than code you write yourself in a general-purpose programming language, unless you put in the time to tune your implementation.

One could rebuild AWK in C and get similar performance, but why not just use some extremely simple AWK? And is anybody going to replicate the amazing speed of grep without a huge time investment? [1]
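
To make that concrete, here's a minimal sketch of the kind of one-liner in question (the file name and column number are made up for illustration):

    # Sum the third column of a hypothetical tab-separated file.
    # The equivalent hand-rolled C would need file handling, tokenising
    # and number parsing before it did any useful work.
    awk -F'\t' '{ sum += $3 } END { print sum }' data.tsv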

There is a right tool for the job, and having watched dozens of programmers encounter new big data sets, I can tell you that the ones who become productive quickly and stay productive are the ones who adopt whatever tool is best for the job, not the ones who stick to their one favorite programming language. In fact, a good sign that somebody will quickly fail is hearing them say "forget those Unix tools, I'm just going to write this in X".

[1] http://ridiculousfish.com/blog/posts/old-age-and-treachery.h...



This is one area where I wish the Unix philosophy (reuse of tools) was taken a bit further. To me, every command should be callable as a C library function. That way you wouldn't have to parse the human-readable output through a pipe. Not only that, there should be both human-readable and machine-readable output for every command. For example, I would love to be able to call "ps" from another script and easily select specific columns from XML or JSON output.
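
For what it's worth, here's a rough sketch of the text-parsing workaround that structured output would replace (the process name being filtered on is just a made-up example):

    # Ask ps for only the columns we care about, then pull out the PIDs of a
    # hypothetical process by name -- exactly the fragile text parsing that
    # JSON or XML output would make unnecessary.
    ps -eo pid,comm | awk 'NR > 1 && $2 == "sshd" { print $1 }'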


PowerShell solves this problem by piping around objects instead of strings. It's pretty neat!


Shell scripts can be pretty powerful if you know what you're doing, but I do agree that sometimes the shell script paradigm can be more of a hurdle than a help.

However, your point about every command being callable as a C library is kind of possible already. Some commands do have native language libraries (e.g. libcurl), but you could also fork out to those ELFs if you're feeling really brave (though in all practicality it's little worse than writing a shell script to begin with). In fact, there are times I've been known to cheat with Perl and run (for example):

    (my $hostname = `hostname`) =~ s/\n//g;
because it's quicker and easier to throw together than using the proper Perl libraries (yeah, it's pretty nasty from an academic perspective, but the additional footprint is minimal while the development time saved is significant).

Of course, any such code that's used regularly and/or depended on will be cleaned up as and when I have the time.

As for your XML or JSON parsing, the same approach as above could be applied:

    use JSON::Parse 'parse_json';
    my $json = `curl --silent http://birthdays.com/myfriends.json`;
    my $bdays = parse_json($json);    # returns a hash reference
    print "derekp7's birthday is $bdays->{derekp7}\n";
Obviously these aren't best practices, but if it's only running locally (i.e. this isn't part of a CGI or other web-accessible script) and gets the job done in a hurry, then I can't see why you shouldn't use it for ad hoc reporting.


But one of the Unix philosophies is to use plain text. To have everything as a C function means everything needs a new API.
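
That plain-text convention is what lets tools that know nothing about each other still compose; a small sketch (the commands are standard, the task is arbitrary):

    # Count logins per user: who, awk, sort and uniq share nothing but
    # newline-delimited text, yet chain into a report with no new API.
    who | awk '{ print $1 }' | sort | uniq -c | sort -rn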



