Many people find it hard to believe how much programmer productivity has increased. For them I have two good examples: a CGI script in Fortran, and a Sudoku solver in COBOL.
People actually used to code stuff like:
Write(6,189)
Write(6,190) QS(7:7),QS(8:8),QS(9:9),QS(10:10),QS(11:11),QS(12:12)
189 Format('<Table border=0><TR><TD bgcolor=''#FFFFFF''>')
190 Format('Submitted color= ''',6A1,''' </TD></TR></Table>')
Write(6,193)
193 Format('<BR>*</TD></TR>')
Go to 320
else if(j.ne.0) then
and even:
DISPLAY 'PT3-W700 = ' PT3-W700.
DISPLAY 'WAGON-CNT-W700= ' WAGON-CNT-W700.
DISPLAY 'ARRAY-C:'.
PERFORM VARYING SUB17 FROM 1 BY 1
UNTIL SUB17 > 81
OR ARRAY-C-NUM (SUB17) = ZERO
DISPLAY ARRAY-C-NUM (SUB17) SPACE
ARRAY-C-POSS (SUB17)
END-PERFORM.
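For contrast, here is a rough sketch of what equivalent code might look like today, in Python. It is only my illustration, not a port of the original programs: qs stands in for the Fortran query-string variable QS, and cells for the COBOL ARRAY-C table, so the names and data shapes here are assumptions.

def emit_color_table(qs):
    # Characters 7-12 of the query string hold the submitted hex color,
    # like QS(7:7)..QS(12:12) in the Fortran above (1-based, inclusive).
    color = qs[6:12]
    print('<table border="0"><tr><td bgcolor="#FFFFFF">')
    print("Submitted color= '%s' </td></tr></table>" % color)

def dump_cells(cells):
    # Walk the cell table and print each cell's number and candidate
    # list, stopping at the first empty entry, just like the COBOL
    # PERFORM VARYING loop above.
    print('ARRAY-C:')
    for num, poss in cells:
        if num == 0:
            break
        print(num, poss)

emit_color_table('color=AAAAAA&submit=1')
dump_cells([(5, '5'), (3, '13'), (0, '123456789')])

A few readable lines replace a screenful of FORMAT labels and PERFORM clauses, which is the whole point of the comparison.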
It's not just a Ruby vs Python vs Java vs Haskell vs Erlang vs Arc thing. There are things far worse than Java. I'm not sure about COBOL vs C++, though.
2 comments:
Before Fortran and COBOL there was assembler language. Before that there was binary, and productivity meant working in octal or hexadecimal.
Think of all the writing (programs, shopping lists, etc.) that was done on the other side of paper containing core dumps.
Newer languages may take less code, but the old machines with so much less memory managed to do so much more, without graphical interfaces.
Fifty people could work on a timesharing system whose memory would today barely be enough to run MS-DOS.
The records for an entire university could fit on a bunch of disk drives whose combined capacity is a fraction of an iPod's.
Millions of dollars' worth of insurance-company records could fit in less space than a Y2K-era PC's hard drive.
Is it more productive to spend an afternoon in PowerPoint than half an hour with a magic marker?
Better hardware is one of many sources of Yannis's Law, but as you can see from these examples, better languages are also very important.