Everybody and their dog has heard of Moore's Law. Yet few know about the far more important Yannis's Law, which states:
Programmer productivity doubles every 6 years
There's no question that today it's possible to get a lot more functionality in less time and with smaller teams than in the past. The really shocking part is the magnitude of the difference.
In his 1972 paper "On the Criteria to Be Used in Decomposing Systems into Modules", David Parnas wrote:
The KWIC index system accepts an ordered set of lines, each line is an ordered set of words, and each word is an ordered set of characters. Any line may be "circularly shifted" by repeatedly removing the first word and appending it at the end of the line. The KWIC index system outputs a listing of all circular shifts of all lines in alphabetical order. This is a small system. Except under extreme circumstances (huge data base, no supporting software), such a system could be produced by a good programmer within a week or two.
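To make the specification concrete, here is what those circular shifts look like for a single sample line (the input line is my own illustration, not from Parnas's paper):

```ruby
# One sample line; every rotation of its words, listed in alphabetical order.
line   = "pipe a carol"
words  = line.split(" ")
shifts = (0...words.size).map { |i| (words[i..-1] + words[0...i]).join(" ") }
puts shifts.sort
# a carol pipe
# carol pipe a
# pipe a carol
```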
Fast-forward to 2003, when Yannis Smaragdakis wrote:
I would not consider a programmer to be good if they cannot produce the KWIC system within an hour or two.
"An hour or two" seemed seriously excessive, so in 2007 I timed myself coding such a system. It took me 4 minutes and 11 seconds to create and test the following code:
res_lines = []
readlines.each do |line|
  words = line.split(" ")
  n = words.size
  n.times { |i| res_lines << (words[(i+1)..n] + words[0..i]).join(" ") }
end
puts res_lines.sort
That's 500 to 1200 times faster than a "good 1972 programmer" (a 40-hour week is 2400 minutes, roughly 570 times my 4 minutes and 11 seconds; two weeks, roughly 1150 times). Maybe my Perl/Ruby background makes me a bit unrepresentative here, but would anybody call a person who cannot implement it in 15 minutes (say, during a job interview) a "good programmer"?
The progress is not over - you can get a huge productivity boost by moving from Java to Ruby, from PHP or J2EE to Ruby on Rails, or by seriously doing test-driven development. That can get you from the 1990s to the 2000s. And there are plenty of people who are still in the productivity 1980s - not using SVN, not knowing regular expressions and Unix, manually recompiling their programs; some are even doing things as insane as coding application-specific network protocols and data storage formats! By the way, there's a cool movie by some NASA guy which times the development of a small website in multiple web frameworks - the difference is really significant.
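Since test-driven development comes up above, here is a minimal sketch of what that workflow looks like in Ruby, using the bundled minitest framework; the circular_shifts helper is my own illustration distilled from the KWIC example, not code from anywhere else:

```ruby
require "minitest/autorun"

# Illustrative helper under test: all circular shifts of one line.
def circular_shifts(line)
  words = line.split(" ")
  (0...words.size).map { |i| (words[i..-1] + words[0...i]).join(" ") }
end

# In TDD you'd write these assertions first, watch them fail, then code until green.
class TestCircularShifts < Minitest::Test
  def test_every_shift_is_produced
    assert_equal ["a b c", "b c a", "c a b"], circular_shifts("a b c")
  end

  def test_single_word_line
    assert_equal ["hello"], circular_shifts("hello")
  end
end
```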
Some people actively deny this progress. The most famous denial is Fred Brooks's 1986 essay "No Silver Bullet - Essence and Accidents of Software Engineering", which claimed that no development could ever bring a tenfold productivity increase. Guess what - by Yannis's Law, programmer productivity has already increased about 11x since 1986 (21 years is 3.5 doublings, and 2^3.5 ≈ 11)!
So programmer productivity doubles every 6 years. Every 6 years it takes only half the time to do the same thing. How many other professions can claim something like that? Is there any other human activity in which this has been happening consistently for decades? (Actually, I wouldn't be surprised if the answer were yes, even if at a somewhat slower pace of improvement.) This means a few things. First, because fewer and fewer people will be needed to do today's work, programming must expand to new areas - and maybe you should look there too. Second, drop the things keeping you in the past and move on. Do you really want to waste your time coding C if you can code many times more effectively in Ruby? Whether it's an Open Source project, a job, or your personal thing, it just makes no damn sense to accept this kind of productivity loss. And third - if you are the sort of person making technology decisions for your company or organization - "because we have always done it that way" is a terrible reason for choosing any programming-related technology. What they're making today is without doubt better, and programmers are really good at adapting, even if they don't want to admit it.