Human Asset Management to Avoid Capital Punishment
Contributed by: Christopher Surdak, President & CEO

Since the Industrial Revolution, Capitalist organizations have worked to combine the classic inputs of Capital, Resources and Labor as effectively as possible.  Those who got this mixture right produced better results and outperformed their competition. But, after 200 years of relentless improvement, has a fundamental error in management thinking set these same organizations up to fail? The recent slowdown in productivity growth suggests that a reassessment of the role of people is not only timely but imperative.

Human Asset Management

Humans aren’t resources.  They aren’t fungible.  They aren’t grain, gas or pig iron.  One cannot be interchanged with another; and if yours can be, you have the wrong people for a digital, 3-D-printed, just-in-time, appified world.

Humans also aren’t capital.  Indeed, they’re the very opposite.  Capital is something that you use as much as you can, as hard as you can, the same way, over and over, with the expectation of wearing it out, depreciating it as you go, and then replacing it once it has become obsolete, and its perceived value has declined to zero.

Instead of these outdated, wrong-headed views of labor, organizations have to start treating labor as a unique input to their production.  Humans must be viewed as unique assets; if they aren’t, they will be liabilities.  They grow and appreciate over time, they add value and they enrich other inputs to a business process.  Or, they do the exact opposite.

Viewing humans as resources or capital made sense because it was easy, and because we had to manage resources and capital in any event.  For 200 years this error was not so egregious that we couldn’t make up for it, and so we did.  But in today’s world, where resources, capital and the Analog Trinity (defined below) are all commoditized, the only lever we have left is human individuality, creativity and ingenuity.  That this shift occurred just as our Human Resource and Human Capital approaches to managing people were being automated and digitally enforced is decidedly inconvenient.

A Brief History of Capitalism

In our present, hyper-fast, just-in-time, click-here-to-buy world it’s often hard to imagine what life may have been like just a century ago.  In 1916 the world was embroiled in World War I, the first truly industrialized war.  The automobile, the airplane, the telephone, the lightbulb, electricity and aluminum were only just finding their way into common use.  People were still wrapping their heads around break-through innovations like the zipper, plastics and instant coffee, while scientists at General Electric were trying to find a use for their new vacuum triode, the precursor to radio, television and other 20th Century miracles.

During the century before that, business people and scientists were simply struggling to understand the new idea of Capitalism. Frederick Winslow Taylor, the father of scientific management, and the classical economists before him described Capitalism as the creation of wealth through the combination of three key inputs: Resources, Capital and Labor.  Resources were raw materials such as minerals, crops, livestock and energy. They were things you grew, mined or harvested. Capital consisted of either enhanced resources that added value to raw materials, or the finances required to do so.  Capital was factories, machines, and money. Finally, labor was the muscle and brain power that used capital to turn resources into finished products.  Combine these three inputs in the right way, and wealth and power would result.

Taylor was aware that there was a fourth category of inputs to Capitalism: Intangibles. There were two types of intangibles: Organizational and Individual. Organizational intangibles were Bureaucracy, Processes and Rules, which I refer to as the Analog Trinity.  These three elements controlled how efficiently and effectively resources, capital and labor were combined.  In his time these were in their infancy.

Prior to 1900, simply accessing these three inputs and putting them together was a challenge. Vertical integration was the name of the game, and the robber barons of the late 19th Century were likely a necessary evil for getting Capitalism out of the starting blocks.

Individual intangibles included things like intelligence, creativity, experience and skill.  Taylor recognized that not all labor was equal.  Some workers were more effective than others, particularly when using old-world, craftsman-like approaches.  Prior to industrialization, different people created outputs of vastly different quality. Indeed, an individual might produce a phenomenal widget on Monday and then a horrible instance of the same widget the following Thursday. Humans are many-splendored creatures: often predictably unpredictable, and decidedly un-fungible.

Taylor and his contemporaries recognized this variability, and decided to use capital to eliminate it.  By making the inputs of people predictable, repeatable and quantified, Taylor worked to make labor more like raw material.  His goal was to make labor commoditized, fungible and interchangeable.  At this he was remarkably successful and so was born the notion of “Human Resources.”

The notion of “Human Resources” was purposefully dehumanizing.  The idea was that individuality, skill and experience were bad things, and they needed to be eliminated in order to make labor easier to use and easier to replace.  This notion that people were interchangeable, combined with the mass migration of people escaping war, famine and civil disorder in Europe and Asia at the time, led to the collapse of wages and the open hostility between “management” and “labor” in the late 1800s.  Indeed, this dehumanization of labor fed directly into the growth of capitalism’s nemesis, Communism, during the same period of history.

Losing for Winning

Economic, technological and social advances spurred on by two world wars expanded humanity’s access to the three basic inputs of capitalism. The industrial demands of these wars forced organizations to address issues of maximizing the effective and efficient use of Capital, Raw Materials and Labor.  By the 1930s the notion of humans as resources had finally crystallized in capitalist thinking, and this, combined with easier access to capital and raw materials, dramatically increased economic growth after the Great Depression. While World War II began as a war between ideologies, it ended as a war of production.  By mobilizing an “arsenal of democracy,” the Allies secured victory and set the stage for a half-century of consumerism and economic growth.

In this post-war world, access to the three basic inputs to capitalist production was secure and their combination was well understood. Advances in global logistics meant that raw materials were readily accessible.  The war lifted the global economy out of its prior depression, and capital was once again available.  Finally, millions of soldiers returned to the domestic pool of labor, adding to the millions of women who had entered the workforce while men were deployed to war. Suddenly, all three of the basic inputs to capitalism were in ample supply, just as war-torn countries began rebuilding their infrastructure and discharged soldiers began marrying and starting families.

Those people left the military, where there was a clear class distinction between officers and soldiers, and entered the workplace, where there was a similar distinction between labor and management.  Managers were considered less fungible than laborers and they were treated that way, exactly like officers in the military. Workers may not have liked being treated as mere “Human Resources” but at least they were making living wages and were no longer dodging bullets.

The Analog Trinity Comes to the Fore

At this point, organizations changed their competitive focus to the less-tangible factors of the Analog Trinity.  Competitive advantage no longer came from simply accessing these inputs; it now mattered how well you combined them.  Business operations became a key differentiator between one company and the next.  Building the best combination of bureaucracy, process and rules separated the winners from the losers. Businesses worked hard to improve these intangible inputs, a process that was dramatically accelerated by the introduction of Information Technology.

For half a century, organizations implemented ever-more-powerful information technology in order to automate their Analog Trinities.  Companies embraced a range of IT tools to improve how they used Resources, Capital and Labor to produce outcomes. By the 1990s most companies were deploying tools we have all heard of: enterprise resource planning (ERP), customer relationship management (CRM), supply chain management (SCM) and the like.

The application of information technology to the intangibles of production, the Analog Trinity, led to the enormous growth in productivity that society enjoyed through most of the last half of the 20th Century. It also became the source of differentiation and competitiveness, at least for a time. Companies that automated their business management became more efficient and effective than their competitors, and customers rewarded them with their dollars. Investing in and implementing enterprise-class IT solutions was extremely expensive and disruptive, but the benefits to companies were large enough to warrant the expense. As a result, business IT became the multi-trillion-dollar industry that we see today.

The use of information technology had several unintended consequences.  One of these was the creation of a new class of worker: the Knowledge Worker. Knowledge workers were programmers, analysts and other semi-white-collar positions that hadn’t existed prior to the use of IT.  These workers definitely weren’t fungible like laborers, but they weren’t quite management, either.  The emergence of this new kind of worker catalyzed the adoption of a new view of labor: “Human Capital.”

Human Capital: Wrong Again

Where Human Resources attempted to treat Labor like raw materials (commoditized and fungible), Human Capital attempted to treat Labor like capital.  Here, capital was something that was invested in, utilized as much as possible and then discarded once the cost of its maintenance exceeded the value it generated.  The Human Capital approach recognized that some people were more productive, creative or valuable than others, and that the organization needed to recognize this difference and extract as much additional value from it as possible, as long as it was cost effective to do so. Human Capital also sounded more politically correct than Human Resources.  It recognized, at least a little, that some people might be less fungible than others.

Under a Human Resources view, people were completely interchangeable, and they were treated that way.  You were an employee only in the sense that your employer needed to keep track of you for tax purposes.  Before unionization, many employees worked day-to-day, never knowing if they’d have a job the following morning. Many workers showed up the next day to find that their job had been taken by someone else willing to work for 5 percent less, or so they were told by their management. Their productivity was measured by how many hours they spent standing at their station, pushing out product. Beyond that, little else mattered.

Human Capital was only slightly better than this. Here, organizations recognized that it was possible for one worker to be better than another. In the world of IT, there was an enormous difference between a talented software developer and a novice, and organizations were forced to recognize this. Reluctantly, most organizations readjusted to the idea that talent, skill and experience mattered.  They were still lousy at measuring and rewarding these distinctions, but change takes time and effort.  By the early 21st Century, workers were being measured by the timeliness of their status reports, how many lines of code they wrote per week or how effectively they contributed to their tiger team.

This perspective, Humans as Capital, when combined with the improving productivity brought by information technology, led to the great labor culling of the 1980s and 1990s.  Organizations around the world saw their productivity soar just as an older generation of workers approached retirement age.  From a capital-centric view, these people were done.  They had gone through a lifetime of annual raises, they were struggling to adopt new tools and technologies, and their prior experience at managing the Analog Trinity was no longer valued.  Hence, hundreds of thousands of older workers were pushed into early retirement by organizations that viewed them as capital: costly to maintain, fully depreciated, and past their useful life.

Human Capital may have sounded better than Human Resources, but the end result was still the same.

The Death of Competition?

Through the 1990s, IT business tools matured to the point that each category had one or two players who were “world class.” For example, nearly every company used SAP, Oracle or PeopleSoft for ERP.  There were a few hangers-on in each segment, but by 2000 the market for enterprise information tools had consolidated around a handful of players.

These players sold themselves on the idea that each was “world class.”  If you wanted to have the best Analog Trinity, you needed to use their software. Soon everyone was, and everyone became “world class” at automating their Bureaucracy, Processes and Rules. This actually facilitated the other enormous business trend of the time: offshoring. Once your business processes were automated, they could be performed anywhere, by anyone.  Since workers were relatively expensive in the U.S. and Europe, replacing them with workers in lower-cost countries was an easy way of reducing costs.  This worked only because enterprise information technology had automated the Analog Trinity, and because new Human Capital metrics made it easy to find suitable replacement labor in cheaper markets.

While costs dropped and productivity grew through this transition, there was a problem.  Soon everyone was using the same software to automate their Analog Trinity and to make themselves world class.  But this homogeneity in approach meant that everyone had the same intangible inputs to their production.  This source of competitive differentiation was gone, as companies all performed their tasks in much the same way as everyone else, and did so with the same inputs. The wave of business automation that swept organizations through the 1990s started to make the Analog Trinity fungible.  ERP, CRM and SCM made everyone equally efficient, and so differentiation became even more difficult.

Outsourcing: The Final Nail in the Coffin

What REALLY made the Analog Trinity fungible was outsourcing.  From the mid-1990s through today, companies have outsourced more and more of the elements of their Analog Trinity in an effort to reduce their costs.  First to go was the labor in their bureaucracies, as this was the most obvious cost.

Once an organization gives part or all of its bureaucracy, processes and rules to a vendor, and the vendor runs them the same way it does for everyone else, the results truly ARE fungible.  Indeed, that was the point of outsourcing: efficiency, and economies of scale and scope. This homogeneity benefitted everyone and no one at the same time, and the Analog Trinity no longer differentiates. If, these days, companies feel like it’s harder to find and keep customers, that’s because it is.  They have the same access to Resources, Capital and Labor as everyone else, and they put these together in the same way as everyone else.  As long as all of these inputs are the same, so too are their outputs. Competition is hard now not so much because of globalization, but because of homogenization.

The Fifth Element: Treating Labor as Labor

In our present world, differentiation is increasingly difficult, and increasingly imperative.  Capitalism has been so successful that most consumers expect perfect outcomes at the lowest possible price, instantaneously and without effort.  And if you don’t give it to them, someone else will.  With every input to production now commoditized, even the old standbys of marketing and advertising no longer work, as consumers are carpet-bombed by messaging through their smartphones and apps.

There is another source of differentiation out there, and it is now coming to the fore.  It is exactly what Taylor and the other early capitalists despised, and it is exactly what 200 years of Human Resources and Human Capital tried to eliminate: human thinking and creativity. The variability that Taylor despised is the last vestige of differentiation left for organizations to leverage. When every other input to your production is the same as everyone else’s, the individual skill, talent, experience and ability of your people are the only differentiators left.

This is bad news for those who view people as either resources or capital, because these perspectives discount or even discourage the very thing that could save them. If your existing tools and techniques for measuring the value of people strive for consistency, repeatability and homogeneity, then the people you retain and reward are those who are least different. This is exactly the opposite of what is required now that all other inputs to your organization are fungible.

The people that you now employ may get their reports in on time, may follow all of your business rules to the letter and may respond to their emails with six-sigma predictability, but are they creative?  Do they generate new, different, innovative ideas?  Likely not.  Or at least not while you expect them to act like raw materials or capital. Almost every manager I have ever had has rationalized or apologized for these metrics. They acknowledged that such measures were utterly useless in determining a person’s actual value to the company, but in the absence of actually-useful metrics they needed to use something, and they believed that quantitative, capital-centric metrics were better than nothing. Arguably, they were horribly wrong in this view. If your organization measures and rewards people for pretending to be coal and corn, or machines and money, then that’s likely all you will get from them.

Social Media, Big Data and the Intimacy Revolution

The topics of Social Media, Big Data and Artificial Intelligence (AI) are enormously popular. This is no accident.  In their struggle to compete, organizations have found that the avenues to success of the past are no longer available.  Simply improving quality or decreasing cost no longer seems to differentiate.  Once you’ve outsourced all of your Analog Trinity, the only thing left to get rid of is yourself.  Executives tend to frown on this option.  As a result, businesses are looking for new ways to compete, and all of those revolve around human differentiation and uniqueness.

Social Media isn’t just a kid’s game.  It’s not about commenting on funny pictures. Social Media is a window into our thoughts, feelings, desires and fears.  People don’t pour out their souls on social media figuratively; they do so literally. The data being generated by this interaction is immense, and Big Data analytics is the result of sixty years of effort to digest and understand unstructured data, such as emails, videos and text, as well as we understand structured data, like what we manage in databases and spreadsheets.  Through Big Data, our analytic techniques have finally advanced to the point that understanding people’s thinking patterns is now possible.  Couple this with the immense volume of information now at our disposal, and developing a deep understanding of each and every individual on the planet is increasingly becoming a reality.

In today’s world, it is now possible to KNOW whether a given employee is an asset or a liability, which means that doing so is now the difference between success and failure. We can now see who actually gives good customer service, versus who simply gets customers off the phone faster.  We can quantitatively assess who is productive, and who is merely a good follower of rules and metrics, resource- or capital-style.  As organizations start to explore this new frontier of analytics they are coming to a new conclusion: people who successfully mimicked raw material or capital aren’t terribly effective at adding value in a differentiated, human sort of way. In retrospect, this should not be a surprise.

Artificial Intelligence and the Renaissance of our Humanity

The other hot topic in business these days is Artificial Intelligence (AI), or Machine Learning (ML).  Here, technologists hope to replicate and eventually replace the very elements of our humanness that we tried to engineer out of our operations for two centuries: creativity, variability and understanding.  It is ironic that we turn to technology to provide the very thing that is readily available in our existing pools of labor; it’s just hard for us to measure.

AI is enticing because it looks like a way to short-circuit our inability to properly understand and measure human cognition.  I would argue that this is a dangerous mindset. AI doesn’t eliminate the need to better understand and measure people; it does the exact opposite. Technology is just a lever: it enhances human abilities.  If we use AI or ML to advance our businesses, we had better leverage the best thinking from the best people. If we leverage stupid people, we’ll just get stupider.

Leveraging Artificial Intelligence and Machine Learning will demand that we understand our humanness better than the current state of the art.  This is not a technical challenge, it is a political and social challenge.  Before AI can have a significant impact on our productivity we must first change our organizations so that they measure, understand, value and reward human contributions to productivity, as distinct from capital, resources and the Analog Trinity.

All of this points to the need to manage people and their value in an entirely different way.  Rather than Human Resources or Human Capital, we must start to manage Humans as Assets.  If we do not, they will surely be liabilities in a world of automated, click-to-accept, predictively-shipped gratification, run with mathematical precision and little, if any, humanity.

The Answer Lies Within

Organizations that make this change stand to benefit greatly, independent of their use of AI, simply because human creativity is the only variable left in the equation of productivity.  Organizations that embrace the notion of humans as assets, rather than resources or capital, will become more responsive, more creative, more innovative and ultimately more successful. If they do this while also embracing AI or ML, they will become unstoppable.

This will necessarily be difficult, because it flies in the face of two centuries of dogma surrounding how we value people and their inputs. Organizations may not want to face this reality because doing so will be hard. Most things worth doing are hard. The rewards will be there for those who invest in this change.  Those who do not will hopefully find comfort in knowing that as long as they devalue the traits that make us human, they themselves will have little value, too.
