Hacker News

A couple of points buried in the talk: defects increase as code size increases, and the increase is greater than linear. So my question for folks is: why would you program in Java rather than Python, when the extra code all but guarantees more bugs, even with static type checking?

The issue of programmer productivity has been around at least since The Mythical Man Month, but the generic responses have been better estimates and better methodologies. There is even the Software Engineering Institute to make life worse for all of us. I think that, except for academics and programmers themselves, no one cares. Large companies used Cobol, and now use Java, because it is safe. Everyone else is doing the same. I once asked an executive why he picked Java over Python for a greenfield reimplementation of his site. He said it was because there were more Java programmers than any other kind.



"A couple of points buried in the talk: defects increase as the code size increases. The increase is greater than linear."

I went looking for these points in the slides, only because the slides got me in the mood to question my assumptions and try to follow citations back to source material - and it wasn't obvious that the presence of something on a slide meant that it was a known truth. (Example: the slides about Martin Fowler's claims about DSLs and productivity are in the slides as a skull on a pikestaff, not as a verified truth.)

After looking, I think you got the first point from slide 15, but I wasn't able to find a slide that mentioned non-linearity.

I like the question that you're asking, with regard to Python versus Java. I wonder, though, if that conclusion really rises to the standards of rigor that this presentation appears to advocate.

Even if you manage to dig up the right Lutz Prechelt citation, has the effect really been shown to be something that is completely unmitigated by static-provability?


Slide 25: "Most Metrics' values increase with code size". The point about a non-linear increase in defects is my (unwarranted) extrapolation. An earlier slide notes that a 25% increase in complexity gives a 100% increase in size.
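As a back-of-the-envelope check of how superlinear that quoted figure is: if you assume a power-law model, size ∝ complexity^k (my assumption, not anything stated on the slides), the numbers pin down the exponent.

```python
import math

# Assumed power-law model (not from the slides): size ∝ complexity**k.
# The quoted figures: +25% complexity -> +100% size, i.e. 1.25**k == 2.
k = math.log(2) / math.log(1.25)
print(f"implied exponent k = {k:.2f}")  # roughly 3.1, i.e. strongly superlinear
```

Under that (admittedly crude) model the exponent comes out around 3, which is why I called the growth greater than linear.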

No, it really doesn't rise to his standards of rigor. Everything I have read supporting that conclusion is anecdotal. The formal studies I have read have usually involved college students on short projects, and such studies are probably worth less than useless. Others probably suffer from the Hawthorne effect.


Do you know if he meant larger as in 'wc -c' larger, or as in number of symbols larger? I'm curious because there are languages (Cobol, Obj-C) that are extremely verbose in terms of characters but not in symbols.
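To make that distinction concrete, here's a toy sketch (my own crude lexer and hypothetical one-liners, purely illustrative): the same computation written in a Cobol-flavored syntax versus a terse one differs a lot in characters but hardly at all in tokens.

```python
import re

def tokens(src):
    # Crude lexer: runs of identifier characters, or single punctuation marks.
    return re.findall(r"[A-Za-z_]\w*|[^\sA-Za-z_]", src)

# Hypothetical equivalents of the same statement (illustrative only).
verbose = "ADD SUBTOTAL TO TAX GIVING TOTAL"  # Cobol-flavored
terse = "total = subtotal + tax"              # terse infix style

for src in (verbose, terse):
    print(f"{len(src):2d} chars, {len(tokens(src))} tokens: {src}")
```

So a 'wc -c' measure would call the first version much larger, while a symbol count sees them as nearly the same size, which is exactly why the question matters.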


The classic study cited in The Mythical Man Month (which unfortunately I don't have handy) said, if I'm remembering it correctly, that the defect count was proportional to lines of code (thus 'wc -l' larger) and independent of the language used.

What I don't know is whether IDE-generated code gets you out of that problem or not.


He was quoting Fernando Corbató, who was comparing lines of assembly to lines of PL/I in large projects. Corbató later recalled this in an article in Byte. Corbató's Law is mentioned in http://en.wikipedia.org/wiki/Fernando_J._Corbat%C3%B3


A study (http://page.mi.fu-berlin.de/prechelt/Biblio//tcheck_tse98.pd...) by the Lutz Prechelt mentioned on the slides suggests that, all else being equal, static type checking reduces defects and increases productivity (the study was for ANSI C vs. K&R C). So a concise but statically typed language (Scala?) might be a better choice than either Java or Python.


> Large companies used Cobol and now Java because it is safe.

"Socially" safe, that is. Damned dangerous in other, more neglected, senses.



