In the 1970s, 80s, and early 90s computers downsized from mainframes to minis and ultimately to networked desktop computers and portables. IBM, DEC, Microsoft, Apple, Adobe, Oracle, HP, Compaq, and Dell dominated the new markets and hired the next cohorts of programmers who generated the next waves of code. Computer applications were becoming essential tools for more and more professions.
By the late 1990s the Internet, which grew out of a U.S. government defense research project begun in the late 1960s, had become the primary engine driving the world's economy, an engine fueled by billions of lines of code generated by millions of programmers, now called "software engineers". IBM, Microsoft, Apple, and Oracle were joined by newcomers like Netscape, Yahoo!, and Google. Computer applications had become consumer commodities.
By the end of the first decade of the New Millennium, computers had downsized yet again as desktops and laptops (i.e., much smaller portables) gave way to tablets and smartphones, thereby creating the largest increase in demand for more software -- and more software engineers -- in the history of computing.
In 1964 only a small percentage of the population had access to computers. Fifty years later, in 2014, all but the poorest members of our society had smartphones -- very small, powerful computers that they carried in their pockets and handbags. Not only was there a need to port old applications to the new platforms, but the fact that users kept these devices with them all the time propelled the explosive growth of social media -- e.g., Facebook, Twitter, and Instagram -- applications that enabled users to interact with each other around the clock, 24/7/365, in ways that were unimaginable on previous platforms.
Indeed, this last wave has surged so quickly that demand for software developers has consistently outpaced the supply, a fact that has led many observers, including government employment analysts, to forecast even larger employment opportunities for software engineers in the foreseeable future. For example, the Bureau of Labor Statistics (BLS) distinguishes between higher-level "software developers" and lower-level "programmers." BLS predicts that the number of jobs for software developers will grow from 1,018,000 to 1,249,000 between 2012 and 2022; BLS also predicts that jobs for computer programmers will increase from 313,700 to 342,000. If these optimistic trends held firm for the following ten years, the combined number of developer and programmer jobs would approach 2,000,000 by 2032.
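The arithmetic behind that extrapolation can be sketched in a few lines. This is an illustration only, not a BLS forecast: it simply assumes each occupation repeats its 2012-2022 growth factor for one more decade, using the figures quoted above.

```python
# Extrapolate the BLS 2012-2022 projections one more decade, assuming
# (for illustration) that each occupation keeps growing at the same
# per-decade rate. The 2012 and 2022 figures come from the text above.

def extrapolate(jobs_2012: int, jobs_2022: int) -> float:
    """Project 2032 employment by repeating the 2012-2022 growth factor."""
    growth = jobs_2022 / jobs_2012
    return jobs_2022 * growth

developers_2032 = extrapolate(1_018_000, 1_249_000)
programmers_2032 = extrapolate(313_700, 342_000)
total_2032 = developers_2032 + programmers_2032

print(f"developers 2032:  {developers_2032:,.0f}")
print(f"programmers 2032: {programmers_2032:,.0f}")
print(f"total 2032:       {total_2032:,.0f}")
```

Run as written, the projected total lands just above 1.9 million -- near, though not quite over, the two-million mark, which is why the trend reads as "approaching" rather than guaranteed.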
The Last Wave
In my opinion we are in the midst of the last wave of human coders.
This is not an apocalyptic prediction of an all-knowing Skynet or a Singularity or a Matrix wherein artificial intelligence (AI) systems surpass human intelligence and may (or may not) rebel against their inferior human "masters." One doesn't have to believe that AI systems will become smarter than humans within the foreseeable future. One only needs to anticipate that clever computer scientists and software engineers will develop code generators that will produce code that is more efficient, more reliable, and far less hackable than the code produced by the vast majority of today's human programmers.
Why will they do so? For three reasons: (1) because it's such an exciting Grand Challenge, (2) because the producers of effective AI-code generators will reap billion-dollar profits, and (3) because our society is in peril if it continues to grow ever more dependent on "hand-made" hackable software systems. (Note: 2014's shocking series of extensive security breaches of the IT systems of the nation's largest banks and retailers shows that our current "hand-made" software systems are unacceptably vulnerable.)
Just as technology reduced farm employment from 38 percent of the labor force at the beginning of the 20th century to less than 3 percent by the century's end, I anticipate that AI-assisted code generators will generate upwards of 90 percent of new computer code sometime within 20 to 30 years. Contrary to the optimistic predictions of the BLS, the unprecedented speed of innovation in IT makes it far more plausible to expect that the foreseeable future will only require a relatively small number of exceptionally talented software engineers and computer scientists -- perhaps hundreds of thousands, instead of the millions predicted by the BLS.
In other words, we should anticipate that sometime within the next 20 to 30 years -- and possibly much sooner than that -- the world's IT elites will create powerful AI-enhanced integrated development environments (IDEs) and other tools that will enable these elites to develop and maintain the AI-enhanced code generators that will produce 90 percent of the code for the world's most needed computer applications.
Computational Thinking Skills vs. Coding Skills
Question(s): Given these anticipations, should readers learn how to code and, perhaps more importantly to some, should they encourage their children to learn how to code?
Answer(s): Yes!!! Definitely. Why? Because as computer programs take over an ever larger share of the lower-level cognitive tasks in more and more occupations, long-term employment will be secured only by those who know how to divide challenges in any field or on any job into two components: the "computable components" that can be resolved by applying appropriate computer applications, and the "non-computable components" that cannot. The capacity to engage in computational thinking will enable successful professionals to identify existing software that can address the computable components and/or to specify new software tools that could be developed to do so. At present there seems to be only one way to develop computational thinking skills: by learning how to code.
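The decomposition described above can be made concrete with a small, invented example (the scenario and every name here are hypothetical, not from the text): a teacher reviewing student essays might treat simple text metrics as the computable component, while judging the quality of an argument remains the non-computable component that still requires the professional.

```python
# Hypothetical illustration of splitting one professional task into a
# computable component (metrics a program can calculate) and a
# non-computable component (judgment left to the human).

def computable_component(essay: str) -> dict:
    """Metrics software can compute: word count and average word length."""
    words = essay.split()
    return {
        "word_count": len(words),
        "avg_word_length": sum(len(w) for w in words) / len(words),
    }

def non_computable_component(essay: str) -> str:
    """Assessing the quality of the argument is left to the professional."""
    return "requires human review"

essay = "Computers extend, but do not replace, professional judgment."
print(computable_component(essay))
print(non_computable_component(essay))
```

The computational-thinking step is not the code itself but the decision about where the dividing line falls: which parts of the task a program can take over, and which parts it cannot.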
Of course, computational thinking skills by themselves will be insufficient. Professionals must still know a lot about the knowledge domains that underlie their fields or occupations. So doctors will still have to understand human biology, lawyers will still have to understand the law, and elementary school teachers will still have to understand childhood development. Indeed, as many have suggested, computational thinking skills, acquired via courses in coding, should be regarded as fundamental generic skills, like reading and writing, that students should be required to develop and enhance throughout their lives from pre-school to K-12 to post-secondary studies and beyond.