In a previous post, I posited that we are on the precipice of the human capital era, where investing in human potential offers the greatest return on investment. Here I explain why the skills gap is no longer a “gap” at all, but what my friend and co-author Chris Shipley has coined the gaping “Skills Abyss” that will never close. That, I believe, is actually a good thing. Hear me out.
Since the dawn of the first hand tools, humans have been filling the skills gap, that space between the people with the know-how to use a tool to do a job and the demand for workers with that know-how. Though the expertise transferred from one skilled person to the next has changed drastically across the millennia, the methods by which skills are transferred have barely budged. Someone with expertise passed along that knowledge to someone who wanted to learn the skill. Over time, the trades created an elaborate framework of apprenticeship. The early church and then universities documented the known world and developed pedagogy and curriculum to pass that knowledge from generation to generation.
That process of documenting, codifying, teaching, testing, and mentoring has continued for centuries, even as the human knowledge base arguably has surpassed any one human’s ability to digest and learn it all. Still, it is a method that largely worked for knowledge of a particular type, what IBM dubbed “perishable skills”. Until now.
Perishable skills are those that are useful over time but ultimately become obsolete. When the world marched at a slower pace, a worker’s command of perishable skills might well have held up over a lifetime. As the pace of change accelerated, however, perishable skills expired more quickly as employers demanded a different skillset to meet more modern applications. Workers had to top up their skills to remain employable, filling the so-called skills gaps with the new learning that employers required.
For much of the last two centuries, workers (often supported by government programs, shifting university curricula, and employers themselves) chased that skills gap in an exhausting effort to keep up with ever-changing demands. Now the half-life of a perishable skill has become razor thin. In its report titled “The Enterprise Guide to Closing the Skills Gap,” the IBM Institute for Business Value presents research suggesting that “skills generally have a ‘half-life’ of about five years, with more technical skills at just two and a half years.” As perishable skills are replaced by more efficient or effective technologies and processes, and replaced more quickly, the skills gap becomes a skills abyss.
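To make the half-life figure concrete, here is a back-of-the-envelope sketch. The five-year and 2.5-year half-lives come from the IBM report cited above; the exponential-decay model and everything else in the code are illustrative assumptions, not something the report specifies.

```python
# Illustrative sketch: how fast a skill's market relevance might decay
# under the IBM half-life figures (5 years general, 2.5 years technical).
# The exponential-decay model is an assumption made for illustration.

def remaining_relevance(years: float, half_life: float) -> float:
    """Fraction of a skill's original market value left after `years`."""
    return 0.5 ** (years / half_life)

if __name__ == "__main__":
    for label, half_life in [("general skill", 5.0), ("technical skill", 2.5)]:
        for years in (2.5, 5.0, 10.0):
            pct = remaining_relevance(years, half_life) * 100
            print(f"{label}: {pct:.0f}% of value left after {years} years")
```

On these assumptions, a technical skill is worth a quarter of its original value after just five years, which is roughly the length of the traditional codify-teach-graduate pipeline described below.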
If your feet are cemented in the old model of skills acquisition and transfer, the idea that we will never again be able to fill the skills gap quickly enough to meet employer demand may be terrifying. Keep in mind that not that long ago, companies outlived careers and people outlived the “best by” dates of perishable skills. A deep dive into a particular skill set and occupational identity, then, would last a lifetime. That is no longer the case. As Chris Shipley and I have long argued, current and future workforces will cycle through many occupations across longer careers and lifetimes. Our tight grasp on a single occupational identity becomes the single highest hurdle to our ability to rapidly re-tool our skills for an even more rapidly changing world, and therein lies the terror for many workers today.
If, however, you are an optimist like me and believe in the power of human potential, you might just see that we are entering the most exciting and liberating era of human work, an era that values all humans for their unique and creative contributions. The Era of Human Capital. Let’s take a closer look.
Diving into the Skills Abyss
For there even to be a skills gap, one human has to demonstrate a skill, the market must deem that skill valuable, and the demand for workers with that skill must exceed the supply. To keep pace with demand, a skill needs to be clearly and tightly defined so that knowledge of that skill can be transferred through classroom or on-the-job training. Ironically enough, the act of codifying a skill for worker training is the same process by which technologies are programmed to perform a skill, that is, automation. Codified skills, it turns out, can in most cases be more cost-effectively “taught” to computers than to humans. Businesses focus on the bottom line; computers are cheaper than humans. Therein lies the basis for what we colloquially call “technological unemployment.” But let’s be clear: for a perishable skill to be automated, it must first be done by a human so that it can be understood, documented, codified, taught – all long before it can be handed over to technology.
Not so long ago, either the market or the academy would anticipate the market need for an emerging new skill or knowledge domain (for example, cybersecurity, machine learning, or data analytics) and then take nearly a decade to meet that need. It might take two to three years to codify, build, and approve the curriculum to teach that skill. Then, students entered a two- to four-year learning program to master the skill, before graduating, finding their first job, and over several years developing mastery of that skill. When change came slowly, as it did in the first industrial revolutions, that decade-long process worked well enough. Now, however, the rate of change is simply too fast for this model to work.
Skilling and Industrial Revolutions
In the first (steam, mechanical) and second (electrification, division of labor for mass production) industrial revolutions, workers were trained, often on the job, to master perishable skills that would, in most cases, last a lifetime. These industrial revolutions focused on competence, training people to answer known questions with the right answers and/or mimic skills to perform to a benchmark level. In the third industrial revolution (computerization), workers were set on course to achieve higher education degrees as work shifted from physical and routine labor to cognitive knowledge labor. We needed people with the capability to port their learned competence to adjacent challenges. These workers needed the cognitive abilities to apply their understanding of known challenges to find both the right question and the right answer to challenges that lay just beyond their realms of familiarity and comfort.
Since 2015, according to the World Economic Forum, we have been transitioning to the fourth industrial revolution that is rapidly automating physical labor and routine cognitive labor. Now we find ourselves in an environment that asks us to explore the unknown well beyond our studied area of expertise, to find and frame novel challenges, and to create new knowledge and value. Both workers and companies must continuously expand their capacity—their capacity to upskill, reskill, learn, unlearn, and explore frontiers beyond the known and the familiar. To do this well, we must – at every level – make a deeper investment in humans knowing that the single greatest return on investment will come from investing in human potential.
A report from The Economist Intelligence Unit titled Closing the Skills Gap captured this transformation succinctly: “The growing mismatch between the needs of business and the offerings of the US education system stems from the fundamental restructuring of the national economy since the 1970s. This shift and the transition to an increasingly service-based economy have led to working environments that require more and more collaboration rather than the performance of repetitive tasks or the operation of machinery.” In other words, first, the shift in our economy from a manufacturing base to a knowledge base began fifty years ago. Second, work is no longer about routine or predictable tasks. Third, most work now depends on the successful collaboration of teams. Bottom line: We need humans who can learn and adapt together and, in doing so, find and frame new challenges in order to create new value.
We need people who can explore the unknown, find and frame problems, formulate novel knowledge, and create new value. Here reskilling and upskilling will be a norm. Just like we add and remove applications on our phones, we will gain new skills and “delete” the old ones.
Let Humans Be More Human
Sadly, we are still stuck in the old model in which discrete phases of education are followed by work. It is a model that forces students through a sorting process to determine who is “university bound” and, where trade opportunities remain, who is “trade tracked”. The knowledge economy has forced more students than ever before onto the higher-education track, where university-bound students are funneled further into advanced or specific courses and rewarded for a myopic focus on learning the right answers to ace standardized tests. Students are motivated externally and rewarded for acquiring perishable knowledge, and too often discouraged from exploration, trial, failure, and experimentation. This was never a good idea; this so-called achievement model limited exploration and tightly bound identities to a specific knowledge base, making adaptation later in one’s career more difficult. As Gallup has shown, this process reduces student engagement and agency. Folks are left feeling helpless and often trapped by identities forged in the past. In short, we developed a system of education that reduces student interest in learning, creates fear of failure, and prepares students with an expectation of certainty when the world they enter will be increasingly uncertain. Collectively, this old model makes it much more difficult to foster a workforce able to learn and adapt continuously.
We Need New Models
The merging of physical, biological, and cyber systems to create innovations such as self-driving vehicles, the internet of things, and advanced artificial cognition doesn’t require reskilling; it requires rethinking. The changes of the fourth industrial revolution are so profound that New York Times columnist Thomas Friedman recently told me that he believes the industrial revolution framework is too limiting. Instead, he believes it is the third Promethean moment. In Greek mythology, the god Prometheus steals fire and gives it to humanity, and from that fire civilization is born. Friedman believes the first Promethean moment occurred when Gutenberg invented the printing press, giving birth to the broader dissemination of knowledge. The second Promethean moment happened when the industrial revolution met capitalism, unleashing an incredible amount of human and industrial energy. It took the world a long time to learn how to get the most out of that energy and cushion its worst effects; it eventually did so through welfare states and global institutions. He believes we are entering the third Promethean moment as three interconnected “climate changes” – rapid accelerations in globalization, technology, and our natural systems – upend almost all our reference points. In this moment, Friedman believes, we need to rethink everything. While I agree with Friedman, I focus my exploration on work and learning, and thus the fourth industrial revolution concept works for this particular discussion on careers, jobs, and skills.
The World Economic Forum predicts as many as 50% of our work tasks may be automated by 2025 and, as a result, half of the global workforce will need to be reskilled in that time period. While the velocity of change continues to accelerate, human longevity continues to expand and, with that expansion, the career arc extends. As Lynda Gratton writes in The Corporate Implications of Longer Lives, life expectancy has lengthened by 2 to 3 years every decade since the 1960s. Folks now in their 40s may need to work into their 70s, while people in their 20s may need to work until their 80s. If a career spans 35 to 45 years for those who started work in their late teens or early twenties, those starting work today may very well be looking at 60-year or longer careers. Even as our lives and careers get longer, the useful life of the skills we learn today is getting shorter.
In this circumstance, we need to focus new learning on knowledge that is durable, rather than skills that are perishable.
Durable skills are foundational. They remain useful even as perishable skills become obsolete. Most importantly, durable skills retain their value over time. Consider the research from Burning Glass: graduates in technical fields received their greatest income premium in their first jobs, and that premium declined over time if those technical skills were not buttressed with durable human skills. Similar findings were reported by the BBC in the UK, where those who studied liberal arts as undergraduates saw a premium when pursuing graduate degrees in law, compared with those who studied law from the start. In short, technical perishable skills depreciate, while durable human skills appreciate.
Which is fantastic news. It directs us to invest in all those capabilities – creative problem solving, collaboration, empathy and compassion, purpose and vision – that are uniquely human. It frees us to invest in people, rather than skills.
SEC Catches Up with the Human Capital Era
That need – to invest in people – is gaining traction among the Wall Street elite, as it becomes increasingly clear that humans are the ones who create all new value. In August 2020, the Securities and Exchange Commission (SEC) mandated “human capital disclosures for all companies selling securities in the United States”. While we’re still not entirely sure what this will mean in practice, PwC has taken a draft view of what this may mean in terms of disclosure. Most importantly, I believe this new and not-yet well-defined requirement shifts the spotlight from physical assets to human beings. In the 1930s, the SEC required disclosure of physical assets, notably property, plant, and equipment (PPE), because 100% of enterprise value was derived from physical assets. Over time, the locus of value creation began to shift. In 1975, 83% of value came from physical assets and 17% from human capital. In 2016, the last time it was calculated, 84% of value came from human capital and 16% from physical assets. The shift tracks with our transition from a manufacturing (1st and 2nd industrial revolutions) to a knowledge-based economy. Even if we begin to manufacture physical products again in the US through additive manufacturing (3D printing), I suspect much of the value will be in the human ingenuity that inspires and designs those products. In short, the value is in the human capital.
In August 2019, the Business Roundtable declared an end to Milton Friedman’s Shareholder Value Era. The Business Roundtable acknowledged what Deloitte’s John Hagel discovered: this fifty-year period of prioritizing returning profits to shareholders actually resulted in lower rates of return on assets and, in fact, may have created economic stagnation as the stock market became increasingly disconnected from the actual economy.
Where Will We Go From Here?
As I have previously written, the pandemic accelerated our transformation to a digital economy, a transformation which is at its core a human transformation. Some economists, notably MIT’s Erik Brynjolfsson, speculate that we may be adopting new technologies and processes now that set us up for a boom cycle driven by a productivity J-curve in the near future.
Whether we enjoy a new Roaring Twenties depends almost entirely on our prudent and broad investment in human capital. If we continue to advance technologically, if we continue to form new knowledge and identify new skills, if we continue to advance and adapt and bravely dive into the Skills Abyss, we may find ourselves in a very good, very human era marked by rapid and continuous progress and more fulfilled human workers.
Note: In the next installment, I will explain why the Human Capital Era means, both morally and economically, that we must invest in all humans.
Thank you to Chris Shipley for invaluable contributions to this piece.