We are at the beginning of the Fourth Industrial Revolution. But how did we get here? Heather McGowan and Chris Shipley, in their book The Adaptation Advantage: Let Go, Learn Fast, and Thrive in the Future of Work, trace that history, starting from the First Industrial Revolution.
The First Industrial Revolution, which lasted from roughly 1760 to 1830, was powered primarily by the steam engine. The Second Industrial Revolution, generally dated between 1870 and 1914, is known for electrification and rapid industrialization. With mechanization and the Scientific Management movement pioneered by Frederick Taylor, work itself became standardized.
As society transitioned from the Second Industrial Revolution to the Third, it needed skilled people trained beyond the trades. That demand drove growth in higher education: the number of colleges and universities doubled between 1960 and 1990, a period referred to as the "massification of higher education." Choosing the right major, and with it a good and stable career, became the accepted path to success.
The Third Industrial Revolution started in the 1950s and is often called the digital revolution. During that time, computerization and automation of physical labor took center stage. Now, we are living through the Fourth Industrial Revolution, where biological and cyber systems are merging into a fully digitized economy.
The Second Industrial Revolution emphasized hyperspecialization and expertise, leading universities in the Third Industrial Revolution to train students in a single skillset for a single industry. While companies today still struggle to fill positions in data analytics and cybersecurity, many students graduate from college with debt, unable to monetize their education. By the time they graduate, the skills they were taught are often outdated or irrelevant. According to the World Economic Forum, the shelf life of a skill is now less than five years. The idea that colleges can train students for an entire career is itself outdated.
We have been so focused on technological advancement and developing machine capabilities that we started training humans to act more like machines. We test people on how well they retrieve information or how well they structure data so computers can understand it. Schools have obsessed over STEM programs, seeing them as a competitive advantage for future generations. Yet although STEM jobs pay a premium initially, Burning Glass has found that those salary premiums decline by as much as 50 percent over the following ten years.
McGowan and Shipley argue that while technology skills depreciate, human skills appreciate. We need to develop our uniquely human skills if we do not want to be displaced by AI and automation. A study by the IBM Institute for Business Value found that the behavioral skills gap is now more significant than the digital skills gap. The World Economic Forum, in its report The Future of Jobs, listed ten skills we will need to thrive in the Fourth Industrial Revolution:
- Complex problem solving
- Critical thinking
- Creativity
- People management
- Coordinating with others
- Emotional intelligence
- Judgment and decision making
- Service orientation
- Negotiation
- Cognitive flexibility
McGowan and Shipley add adaptability as the "number-one skill in demand." None of these skills is technical. As research has shown repeatedly, the skills we will need in the future of work are human skills: generating original ideas, looking at issues from various points of view, making nonlinear connections, collaborating with others, and combining ideas to create something new. As McGowan and Shipley underscore, "When we use phrases like 'digital transformation,' we are really talking about 'human transformation.'"