Artificial intelligence (AI) seems to have taken on the quality usually associated with dreams. Like dreams, which express our hopes and fears, AI-powered robots promise, on one hand, a future of leisure free of tedious and backbreaking work, in which all can live as they imagine the aristocrats of old once did, with a huge staff eager to please. On the other hand, they threaten a nightmare of widespread joblessness, poverty, and hopelessness for a large part of the population who, because of AI, will never again find meaningful work. Sometimes the same commentator paints one picture on Saturday and the other on Monday.
Of late, the media seem to have focused on the darker side. Several prominent figures have wondered publicly about the ill effects on society as robots leave millions with no prospects for gainful employment. Democratic Party presidential hopeful Andrew Yang has built his campaign around these prospective ills. He claims that AI has already destroyed four million factory jobs across the Midwest and has warned of millions more jobs lost to robots and their ilk. He has trumpeted the need for a universal basic income (UBI), a stipend paid to all by the federal government, to answer the impending surge in joblessness as robots and other expressions of AI take over more and more functions in the workplace.
He is far from alone. Technology luminaries such as Bill Gates and Elon Musk have made similar forecasts and similar calls to answer society’s need. The good and the great who gathered not long ago at Davos heard such descriptions and such calls and took them very seriously.
Much of this AI fear rests on calculations made by Carl Frey and Michael Osborne in a widely cited 2013 Oxford study. There, they forecast that AI would eliminate 47 percent of U.S. jobs and 35 percent of UK jobs by 2035. Those figures have found their way into several articles and government reports in this country, Great Britain, and elsewhere. But theirs are not the only figures. Shortly after the Frey and Osborne study appeared, the Organization for Economic Cooperation and Development (OECD) looked into the matter. It took a broader view of the question than Frey and Osborne. Instead of finding risk in every task AI could do, it identified the potential for job loss only in those tasks that AI could do profitably. That distinction made a major difference. The OECD concluded that only 10 percent of U.S. jobs and some 12 percent of British jobs were at risk from AI.
Two considerations suggest that the second, less frightening study is closer to reality, and that even it may overstate the ultimate damage. The first is the more economically realistic way the OECD examined the question and its willingness to work with more granular data. The second is that this is not the first time the public has heard panicked cries about what AI and robots would do to jobs. Such warnings have come and gone several times over the decades. Back in the 1960s, for example, when AI was called “automation,” a group of Nobel laureates warned that “new kinds of automation” had “broken” the “link between incomes and jobs.”
Then-President John F. Kennedy took those warnings to heart and spoke to the nation of the “dark menace of industrial dislocation, increasing unemployment, and deepening poverty.” His successor, Lyndon Johnson, responded to the potential threat by calling for widespread “family relief” to ease the prospective strain on working men and women – not quite a UBI but a government stipend nonetheless. All these fears and warnings emerged during what today’s alarmists might well consider a golden age of American manufacturing.
To be sure, automation did eliminate jobs in the 1960s. New technologies have done so since machinery was first introduced into mass production in the eighteenth century. But innovations have always changed things less quickly and less completely than alarmists at each stage claimed they would. The fact is that new technologies require much else to change before they can eliminate the jobs people fear they will take. Before business can use technological innovations effectively, it must change its practices and procedures and retrain those who stay on the job. That adjustment can take years, even decades. During that time, the marketplace has a chance to adjust to the new technology, and it has always seemed to use both the time and the technology to create new jobs.
Take, for example, the personal computer. Invented in the 1970s, it took 20-plus years to become commonplace on office desks and shop floors across the country. What delayed its adoption was not the technology, though it improved dramatically over that time.
What took time was the alteration of practices in offices and factories so that they could accommodate the new machines and glean benefits from them. Only over this relatively long stretch did the PC revolutionize business and, in the process, displace hundreds of thousands, if not millions, of typists and clerks as well as dispatchers and the like. But though the innovations eventually did replace people, the passage of time eased the social strains. Importantly, it enabled others to see how those same technologies could support activities that were previously inconceivable and employ people in them. During those 20-some years from the 1970s to the 1990s, the nation saw the rise of Federal Express and similar firms, as well as cable television, to name just two areas that now employ millions in jobs that did not previously exist and that depend, directly or indirectly, on the very computer technologies that destroyed so many other jobs.
Throughout the long history since the eighteenth century, when people first applied machinery to mass production, this same pattern has repeated over and over. Each wave of change displaced a portion of the workforce but enabled the growth of new occupations that absorbed those displaced from older activities. Had this pattern not prevailed, each wave of technology since the days of the horse-drawn plow and the stagecoach would have left an ever-larger portion of the population unemployable. In fact, the economy on average has continued to employ 95 percent or more of those who want to work. That one statistic alone demonstrates that innovations create even as they destroy.
To be sure, it has always been difficult, impossible in fact, to forecast which new activities will create jobs for those displaced by technological innovations. Few if any have that kind of imagination, certainly not at the start of each new technological wave. Who in 1972, for instance, could have envisioned an air delivery scheme that could make millions of deliveries by the next day and track each parcel at every stage of the process? The jet planes were around, but not the computer and communications technologies that enable this kind of organization to function. It is because of this failure of imagination that people can see only the destructive part of the picture. Still, a failure of imagination is no reason to dismiss a pattern that history has verified time and again. So it is today, even with Gates and Musk and Yang.