
Human Asset Management to Avoid Capital Punishment

contributed by: Christopher Surdak, President & CEO

Since the Industrial Revolution, Capitalist organizations have worked to combine the classic inputs of Capital, Resources and Labor as effectively as possible.  Those who got this mixture right produced better results and outperformed their competition. But after 200 years of relentless improvement, has a fundamental error in management thinking set these same organizations up to fail? The recent slowdown in productivity growth may indicate that a reassessment of the role of people is not only timely but imperative.

Human Asset Management

Humans aren’t resources.  They aren’t fungible.  They aren’t grain, or gas, or pig iron.  One worker cannot be interchanged with another; and if yours can be, you have the wrong ones in a digital, 3-D-printing, just-in-time, appified world.

Humans also aren’t capital.  Indeed, they’re the very opposite.  Capital is something that you use as much as you can, as hard as you can, the same way, over and over, with the expectation of wearing it out, depreciating it as you go, and then replacing it once it has become obsolete, and its perceived value has declined to zero.

Instead of these outdated, wrong-minded views of labor, organizations have to start treating workers as unique inputs to their production.  Humans must be viewed as unique assets; if they aren’t, they will be liabilities.  They grow and appreciate over time, they add value, and they enrich the other inputs to a business process.  Or they do the exact opposite.

Viewing humans as resources or capital made sense because it was easy, and because we had to manage resources and capital in any event.  For 200 years this error was not so egregious that we couldn’t make up for it, and so we did.  But in today’s world, where resources, capital and the Analog Trinity are all commoditized, the only lever we have left is human individuality, creativity and ingenuity.  That this shift occurred just as our Human Resource and Human Capital approaches to managing people were being automated and digitally enforced is decidedly inconvenient.

A Brief History of Capitalism

In our present, hyper-fast, just-in-time, click-here-to-buy world it’s often hard to imagine what life may have been like just a century ago.  In 1916 the world was embroiled in World War I, the first truly industrialized war.  The automobile, the airplane, the telephone, the lightbulb, electricity and aluminum were only just finding their way into common use.  People were still wrapping their heads around breakthrough innovations like the zipper, plastics and instant coffee, while scientists at General Electric were trying to find a use for their new vacuum triode, the precursor to radio, television and other 20th Century miracles.

During the century before that, business people and scientists were simply struggling to understand the new idea of Capitalism. Frederick Taylor and other 19th-century thinkers defined Capitalism as the creation of wealth through the combination of three key inputs: Resources, Capital and Labor.  Resources were raw materials such as minerals, crops, livestock and energy; they were things you grew, mined or harvested. Capital consisted of either enhanced resources that added value to raw materials, or the finances required to do so.  Capital was factories, machines and money. Finally, Labor was the muscle and brain power that used capital to turn resources into finished products.  Combine these three inputs in the right way, and wealth and power would result.

Taylor was aware that there was a fourth category of inputs to Capitalism: Intangibles. There were two types of intangibles: Organizational and Individual. Organizational intangibles were Bureaucracy, Processes and Rules, which I refer to as the Analog Trinity.  These three elements controlled how efficiently and effectively resources, capital and labor were combined.  In his time these were in their infancy.

Prior to 1900, simply accessing these three inputs and putting them together was a challenge. Vertical integration was the name of the game, and the robber barons of the late 19th Century were likely a necessary evil for getting Capitalism out of the starting blocks.

Individual intangibles included things like intelligence, creativity, experience and skill.  Taylor recognized that not all labor was equal.  Some workers were more effective than others, particularly when using old-world, craftsman-like approaches.  Prior to industrialization, different people created outputs of vastly different quality. Indeed, an individual might produce a phenomenal widget on Monday and then a horrible instance of the same widget the following Thursday. Humans are many-splendored things, and often we are predictably unpredictable, and decidedly un-fungible.

Taylor and his contemporaries recognized this variability, and decided to use capital to eliminate it.  By making the inputs of people predictable, repeatable and quantified, Taylor worked to make labor more like raw material.  His goal was to make labor commoditized, fungible and interchangeable.  At this he was remarkably successful and so was born the notion of “Human Resources.”

The notion of “Human Resources” was purposefully dehumanizing.  The idea was that individuality, skill and experience were bad things, and that they needed to be eliminated in order to make labor easier to use, and easier to replace.  This notion that people were interchangeable, combined with the mass migration of people escaping war, famine and civil disorder in Europe and Asia, led to the collapse of wages and the open hostility between “management” and “Labor” in the late 1800s.  Indeed, this dehumanization of labor led directly to the creation and growth of capitalism’s nemesis, Communism, at that same point in history.

Losing for Winning

Economic, technological and social advances spurred on by two world wars expanded humanity’s access to the three basic inputs to capitalism. The industrial demands of these wars forced organizations to address issues of maximizing the effective and efficient use of Capital, Raw Materials and Labor.  By the 1930s the notion of humans as resources had finally crystallized in capitalist thinking and this, combined with easier access to capital and raw materials, dramatically increased economic growth after the Great Depression. While World War II began as a war between ideologies, it ended as a war of production.  By mobilizing an ‘arsenal of democracy’ the Allies secured victory and set the stage for a half-century of consumerism and economic growth.

In this postwar world, access to the three basic inputs to capitalist production was secure and their combination was well understood. Advances in global logistics meant that raw materials were readily accessible.  The war lifted the global economy out of its prior depression, and capital was once again available.  Finally, millions of soldiers returned to the domestic labor pool, adding to the millions of women who had entered the workforce as men were deployed to war. Suddenly, all three of the basic inputs to capitalism were in ample supply, just as war-torn countries began rebuilding their infrastructure and discharged soldiers began marrying and starting families.

Those people left the military, where there was a clear class distinction between officers and soldiers, and entered the workplace, where there was a similar distinction between labor and management.  Managers were considered less fungible than laborers and they were treated that way, exactly like officers in the military. Workers may not have liked being treated as mere “Human Resources” but at least they were making living wages and were no longer dodging bullets.

The Analog Trinity Comes to the Fore

At this point, organizations changed their competitive focus to the less-tangible factors of the Analog Trinity.  Competitive advantage came not just from accessing these inputs; it now mattered how well you combined them.  Business operations became a key differentiator between one company and the next.  Building the best combination of bureaucracy, process and rules separated the winners from the losers. Businesses worked hard to improve these intangible inputs, a process that was dramatically accelerated by the introduction of Information Technology.

For half a century, organizations implemented ever-more-powerful information technology in order to automate their Analog Trinities.  Companies embraced a range of IT tools to improve how they used Resources, Capital and Labor in order to produce outcomes. By the 1990s most companies were deploying tools we have all heard of: ERP, CRM, SCM and the like.

The application of information technology to the intangibles of production, the Analog Trinity, led to the enormous growth in productivity that society enjoyed through most of the latter half of the 20th Century. It also became the source of differentiation and competitiveness, at least for a time. Companies that automated their business management became more efficient and effective than their competitors, and customers rewarded them with their dollars. Investing in and implementing enterprise-class IT solutions was extremely expensive and disruptive, but the benefits to companies were large enough to warrant the expense. As a result, business IT became the multi-trillion-dollar industry that we see today.

The use of information technology had several unintended consequences.  One of these was the creation of a new class of worker: the Knowledge Worker. Knowledge workers were programmers, analysts, and other semi-white-collar positions that hadn’t existed prior to the use of IT.  These workers definitely weren’t fungible like laborers, but they weren’t quite management, either.  The emergence of this new kind of worker catalyzed the adoption of a new view of Labor: “Human Capital.”

Human Capital: Wrong Again

Where Human Resources attempted to treat Labor like raw materials (commoditized and fungible), Human Capital attempted to treat Labor like capital.  Here, capital was something that was invested in, utilized as much as possible and then discarded once the costs of its maintenance exceeded the value it generated.  The Human Capital approach recognized that some people were more productive, creative or valuable than others, and that the organization needed to recognize this difference and extract as much additional value from it as possible, as long as it was cost effective to do so. Human Capital also sounded more politically correct than Human Resources.  It recognized, at least a little, that some people might be less fungible than others.

Under a Human Resources view, people were completely interchangeable, and they were treated that way.  You were an employee only in the sense that your employer needed to keep track of you for tax purposes.  Before unionization, many employees worked day-to-day, never knowing if they’d have a job the following morning. Many workers showed up the next day to find that their job was taken by someone else willing to work for 5 percent less, or so they were told by their management. Their productivity was measured by how many hours they spent standing at their station, pushing out product. Beyond that, little else mattered.

Human Capital was only slightly better than this. Here, organizations recognized that it was possible for one worker to be better than another. In the world of IT, there was an enormous difference between a talented software developer and a novice, and organizations were forced to recognize this. Reluctantly, most organizations readjusted to the idea that talent, skill and experience mattered.  They were still lousy at measuring and rewarding these distinctions, but change takes time and effort.  By the early 21st Century, workers were being measured by the timeliness of their status reports, how many lines of code they wrote per week or how effectively they contributed to their tiger team.

This perspective, Humans as Capital, when combined with the improving productivity brought by information technology, led to the great labor culling of the 1980s and 1990s.  Organizations around the world saw their productivity soar just as the Baby Boom generation started to approach retirement age.  From a capital-centric view, these people were done.  They had gone through a lifetime of annual raises, they were struggling to adopt and use new tools and technologies in new ways, and their prior experience at managing the Analog Trinity was no longer valued.  Hence, hundreds of thousands of older workers were pushed into early retirement by organizations that viewed them as capital: costly to maintain, fully depreciated, and past their useful life.

Human Capital may have sounded better than Human Resources, but the end result was still the same.

The Death of Competition?

Through the 1990s, IT business tools matured to the point that each category had one or two players who were “world class.” For example, nearly every company used SAP, Oracle or PeopleSoft for ERP.  There were a few hangers-on in each segment, but by 2000 the market for enterprise information tools had consolidated around a handful of players.

These players sold themselves on the idea that each was “world class.”  If you wanted to have the best Analog Trinity, you needed to use their software. Soon everyone was, and everyone became “world class” at automating their Bureaucracy, Processes and Rules. This actually facilitated the other enormous business trend of the time: offshoring. Once your business processes were automated, they could be performed anywhere, by anyone.  Since workers were relatively expensive in the U.S. and Europe, replacing these people with workers in lower-cost countries was an easy way of reducing costs.  This only worked because enterprise information technology had automated the Analog Trinity, and because their new Human Capital metrics let organizations find suitable replacement labor in cheaper markets.

While costs dropped and productivity grew through this transition, there was a problem.  Soon, everyone was using the same software to automate their Analog Trinity and to make them world class.  But this homogeneity in approach meant that everyone had the same intangible inputs to their production.  This source of competitive differentiation was gone, as companies all performed their tasks pretty much the same as everyone else, and did so with the same inputs. The wave of business automation that swept organizations through the 1990s started to make the Analog Trinity fungible.  ERP, CRM and SCM made everyone equally efficient, and so differentiation became even more difficult.

Outsourcing: The Final Nail in the Coffin

What REALLY made the Analog Trinity fungible was outsourcing.  From the mid-1990s through today, companies have outsourced more and more of the elements of their Analog Trinity in an effort to reduce their costs.  First to go was the labor in their bureaucracies, as this was the most obvious cost.

Once an organization gives part or all of its bureaucracy, processes and rules to some vendor, and the vendor runs it the way it does for everyone else, the results truly ARE fungible.  Indeed, that was the point of outsourcing it: efficiency, and economies of scale and scope. This homogeneity benefits everyone and no one at the same time, and the Analog Trinity no longer differentiates. If, these days, companies feel like it’s harder to find and keep customers, that’s because it is.  They have the same access to Resources, Capital and Labor as everyone else, and they put these together in the same way as everyone else.  As long as all of these inputs are the same, so too are their outputs. Competition is hard now not so much because of globalization, but because of homogenization.

The Fifth Element: Treating Labor as Labor

In our present world, differentiation is increasingly difficult, and increasingly imperative.  Capitalism has been so successful that most consumers expect perfect outcomes at the lowest possible price, instantaneously and without effort.  And if you don’t give it to them, someone else will.  All of the inputs to production are now commoditized.  Even the old standbys of marketing and advertising no longer work, as consumers are carpet-bombed with messaging through their smartphones and apps.

There is another source of differentiation out there, and it is now coming to the fore.  It is exactly what Taylor and the other early capitalists despised, and it is exactly what 200 years of Human Resources and Human Capital tried to eliminate: human thinking and creativity. The variability that Taylor despised is the last vestige of differentiation left for organizations to leverage. When every other input to your production is the same as everyone else’s, the individual skill, talent, experience and ability of your people are the only differentiators left.

This is bad news for those who view people as either resources or capital, because these perspectives discount, or even discourage, the very thing that could save them. If your existing tools and techniques for measuring the value of people strive for consistency, repeatability and homogeneity, then the people you retain and reward are those who are least different. This is exactly the opposite of what is required, now that all other inputs to your organization are fungible.

The people that you now employ may get their reports in on time, may follow all of your business rules to the letter and may respond to their emails with six-sigma predictability, but are they creative?  Do they generate new, different, innovative ideas?  Likely not.  Or at least not while you expect them to act like raw materials or capital. Almost every manager I have ever had has rationalized or apologized for these metrics. They acknowledged that the metrics were utterly useless in determining a person’s actual value to the company, but in the absence of actually-useful metrics they needed to use something, and they believed that such quantitative, capital-centric metrics were better than nothing. Arguably, they were horribly wrong in this view. If your organization measures and rewards people for pretending to be coal and corn, or machines and money, then that’s likely all you will get from them.

Social Media, Big Data and the Intimacy Revolution

The topics of Social Media, Big Data and Artificial Intelligence (AI) are enormously popular. This is no accident.  In their struggle to compete, organizations have found that the avenues to success in the past are no longer available.  Simply improving quality or decreasing cost no longer seems to differentiate.  Once you’ve outsourced all of your Analog Trinity, the only thing left to get rid of is yourself.  Executives tend to frown on this option.  As a result, businesses are looking for new ways to compete, and all of them revolve around human differentiation and uniqueness.

Social Media isn’t just a kids’ game.  It’s not about commenting on funny pictures. Social Media is a window into our thoughts, feelings, desires and fears.  People don’t pour out their souls on social media figuratively; they do so literally. The data being generated by this interaction is immense, and Big Data analytics is the result of sixty years of effort to digest and understand unstructured data, such as emails, videos and text, as well as we understand structured data, like what we manage in databases and spreadsheets.  Through Big Data, our analytic techniques have finally advanced to the point that understanding people’s thinking patterns is now possible.  Couple this with the immense volume of information now at our disposal, and developing a deep understanding of each and every individual on the planet is increasingly becoming a reality.
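To make that concrete, here is a minimal, hypothetical sketch of the kind of pipeline involved: unstructured posts go in, a structured, analyzable signal comes out. The tiny hand-rolled lexicon is a stand-in for the far richer models a real deployment would use, and every name and data point is invented for illustration.

```python
# Toy unstructured-to-structured pipeline (illustrative only).
# A real system would use a proper NLP library and trained models;
# this shows the shape: free text in, per-person signal out.
from collections import defaultdict

POSITIVE = {"love", "great", "excited", "happy", "recommend"}
NEGATIVE = {"hate", "awful", "frustrated", "angry", "cancel"}

def sentiment_score(text: str) -> int:
    """Crude polarity: +1 per positive token, -1 per negative token."""
    tokens = text.lower().split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

def profile_users(posts):
    """Aggregate per-user average sentiment from (user, post) pairs."""
    totals, counts = defaultdict(int), defaultdict(int)
    for user, text in posts:
        totals[user] += sentiment_score(text)
        counts[user] += 1
    return {user: totals[user] / counts[user] for user in totals}

posts = [
    ("ana", "I love this product and would recommend it"),
    ("ana", "so excited for the new release"),
    ("bob", "frustrated again, about to cancel my account"),
]
print(profile_users(posts))  # {'ana': 1.5, 'bob': -2.0}
```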

In today’s world, it is now possible to KNOW whether a given employee is an asset or a liability.  Which means doing so is now the difference between success and failure. We can now see who actually gives good customer service, versus who simply gets customers off of the phone faster.  We can quantitatively assess who is productive, and who is merely a good follower of rules and metrics, resource- or capital-style.  As organizations start to explore this new frontier of analytics they are coming to a new conclusion: people who successfully mimicked raw material or capital aren’t terribly effective at adding value in a differentiated, human sort of way. In retrospect, this should not be a surprise.
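As a hedged sketch of that asset-versus-liability assessment, the fragment below separates a capital-style speed metric from the outcomes customers actually experienced. The field names and thresholds are invented, not drawn from any real scoring system.

```python
# Hypothetical asset-vs-liability check (all fields and cutoffs invented).
# Call speed alone is a capital-style metric; pairing it with outcome
# signals reveals who is actually adding value.
from dataclasses import dataclass

@dataclass
class AgentStats:
    name: str
    avg_call_minutes: float   # the old, capital-style speed metric
    repeat_call_rate: float   # fraction of customers who had to call back
    satisfaction: float       # post-call survey score, 0..1

def classify(agent: AgentStats) -> str:
    """Asset: resolves issues, whatever the call length.
    Liability: fast calls that customers have to repeat."""
    if agent.satisfaction >= 0.8 and agent.repeat_call_rate <= 0.1:
        return "asset"
    if agent.avg_call_minutes < 5 and agent.repeat_call_rate > 0.3:
        return "liability: fast, but unresolved"
    return "needs a closer look"

agents = [
    AgentStats("fast_finisher", avg_call_minutes=3.2, repeat_call_rate=0.45, satisfaction=0.40),
    AgentStats("problem_solver", avg_call_minutes=11.8, repeat_call_rate=0.05, satisfaction=0.92),
]
for agent in agents:
    print(agent.name, "->", classify(agent))
```

Under the old speed metric, fast_finisher looks like the star; add the outcome signals and the ranking inverts.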

Artificial Intelligence and the Renaissance of our Humanity

The other hot topic in business these days is Artificial Intelligence (AI), or Machine Learning (ML).  Here, technologists hope to replicate and eventually replace the very elements of our humanness that we tried to engineer out of our operations for two centuries: namely, creativity, variability and understanding.  It is ironic that we turn to technology to provide the very thing that is readily available in our existing pools of labor; it is simply hard for us to measure.

AI is enticing because it looks like a way to short-circuit our inability to properly understand and measure human cognition.  I would argue that this is a dangerous mindset. AI doesn’t eliminate the need to better understand and measure people; it does the exact opposite. Technology is just a lever.  It enhances human abilities.  If we use AI or ML to advance our businesses, we had better leverage the best thinking from the best people.  If we leverage stupid people, we’ll get stupider.

Leveraging Artificial Intelligence and Machine Learning will demand that we understand our humanness better than the current state of the art allows.  This is not a technical challenge; it is a political and social challenge.  Before AI can have a significant impact on our productivity we must first change our organizations so that they measure, understand, value and reward human contributions to productivity, as distinct from capital, resources and the Analog Trinity.

All of this points to the need to manage people and their value in an entirely different way.  Rather than Human Resources or Human Capital, we must start to manage Humans as Assets.  If we do not, they will surely be a liability in a world of automated, click-to-accept, predictively shipped gratification, run with mathematical precision and little, if any, humanity.

The Answer Lies Within

Organizations that make this shift stand to benefit greatly, independent of their use of AI, simply because human creativity is the only variable left in the equation of productivity.  Organizations that embrace the notion of humans as assets, rather than resources or capital, will become more responsive, more creative, more innovative and ultimately more successful. If they do this while also embracing AI or ML, they will become unstoppable.

This will necessarily be difficult, because it flies in the face of two centuries of dogma surrounding how we value people and their inputs. Organizations may not want to face this reality because doing so will be hard. Most things worth doing are hard. The rewards will be there for those who invest in this change.  Those who do not will hopefully find comfort in knowing that as long as they devalue those traits that make us human, they themselves will have little value, too.


Is Big Data a Big Bummer?

contributed by: Christopher Surdak, President & CEO

As an engineer, science fiction movies are usually a hit with me. Nothing entertains me like a space opera with lasers, robots and Homeric heroes.  But when I’m asked which movie is my favorite, some of the titles on my short list might be a surprise.  The Shawshank Redemption is one such movie. Set in the mid-1900s and based on a novella by Stephen King, it is about a man who, by a tragic turn of events, found himself serving two life terms in prison for murders he didn’t commit.

Rather than letting this hopeless situation crush his spirit, he sets himself on a forty-year-long mission to get out.  It’s a story that shows how your situation is not your fate; rather, how you choose to deal with your situation determines your fate.  If you are honest with yourself about your situation, and deal with it with hope instead of despair, you might generate surprising results in the end.

A Prison of Habit?

I recently had a discussion with two executives from a large financial firm.  We covered a range of topics, both internal and external to their company, from negative interest rates to ISIL, and from re-engagement with Cuba to their troubles with connecting with Generation Z.  As hither-and-yon as these topics may be, they are interrelated, or at least that’s what all of my research over the last decade tells me. These disruptions to the world most of us grew accustomed to are appearing in all aspects of our lives.  To me, this synchronicity is no accident.

Over the last three years I’ve given over three hundred presentations on Big Data, Analytics and Organizational Change so I’ve seen a lot of reactions from a lot of audiences.  The reactions almost always follow the Seven Stages of Dealing with Loss. According to psychologists these seven stages are:

Shock
Denial
Anger
Bargaining
Depression
Testing
Acceptance

Shock

From my hundreds of presentations on Big Data, disruption, and so on, shock is the most typical response I have seen.  Those who are shocked usually just leave the room: mouths open, eyes wide as saucers, a slight stagger to their gait. It is not unusual for me to finish my presentation and ask if there are any questions, only to have no one volunteer one, at least not at first.

However, when I stick around after a presentation, people will invariably approach me about half an hour later, fairly bursting with questions.  That’s a good indication of shock: it takes people a while to process what they just heard and digest it a bit, and only then can they respond with a million questions.  I’m generally sure that they’ll survive their initial shock when they’re asking questions faster than I can answer them.

In Big Data, shock comes once people realize how pervasive these tools and techniques already are, how far behind most organizations really are, and how difficult it will be to catch up to those with a head start. If you are here, there is much work for you to do.

Denial

The bulk of executives I meet with go with Denial: Maybe all of that change is going on, but we’re immune to it because INSERT SOME RATIONALE OR OTHER HERE.  They’re convinced that because of some specialness in what they do, change won’t happen to them.  And over the years, I’ve heard them all:  We’re too big, we’ve been doing this too long, we’re too regulated, we’re too smart, we’re too simple, we’re too complex, we’re too capital-intensive, our customers love us too much, and so on and so on.  Denial is easy because it costs nothing, at least for a while.  And, it is remarkably self-assuring to tell yourself that through your hard work you’re too-something for the outside world to affect you. Your competitors love it when you buy into the notion that you’re some kind of special snowflake, because you’re making it easy for them to annihilate you while you sit in your comfortable bubble.

Denial is common because it requires no thought, no effort and no budget, at least for a while.

Anger

A small percentage of the audience responds with anger.  However, those who are angered by what I present rarely approach me to discuss their anger directly.  Instead they let it fly on their surveys after the event.  After all, it’s easy to get in someone’s face when you’re not actually in their face.  Anyone can be courageous through anonymity.  When this happens they’re usually attacking me on style, rather than content.  I was too brash, I wasn’t PC enough, the color of my Starbucks cup was offensive, whatever.  I generally feel that everyone is entitled to their own opinion.  Variety makes us stronger.  So when I get this sort of response I usually write it off as another example of people who believe in being ‘open-minded,’ as long as you do it their way.

Anger in Big Data comes from the notion that everything people have worked so hard to achieve for so long is somehow wrong: that everyone who has worked in business or technology for the last fifty years is somehow incompetent or ignorant, and that the changes we now face should have been knowable decades ago.  Anger at this viewpoint is fair, because the assessment itself is anything but.  People weren’t incompetent for the last fifty years.  Rather, they were so successful that their entire game has now changed.

Bargaining

Bargaining is the next phase of grief management, where the mind tries to make a deal with the universe, attempting to get a better outcome through karmic barter.  This almost always manifests as someone saying, “I retire in X years, I just want things to stay the same until then so I don’t have to deal with it.”  I hear that one A LOT!  Bargaining maintains your subconscious’ need for the illusion of control, rather than accepting that in this instance you have none.  Loss of control is very disquieting, so trying to trade a smaller loss for the one that you’re facing is an obvious psychic ploy.

Bargaining is the typical strategy for technical people. They frequently skip past all of the emotional mumbo-jumbo and want to get right to solving the problem.  Or at least, what they believe to be the problem. You see bargaining when technical people start to offer less-offensive solutions to the problem at hand.  Sometimes these might actually work.  But more often, they are palliative detours designed to make people feel like they can choose the degree of change their situation requires.

In Big Data, you see bargaining every time a CIO or CTO claims that a new platform is the solution to everything.  You see it every time a CMO or COO gets a technology budget of their own.  You see it every time a CEO introduces their company to their new “Chief Data Officer,” who has no budget, staff or mandate other than to make this ‘data stuff’ go away.

Depression

A relative few go with Depression: we are doomed and there’s nothing we can do about it. I try to console them with, “There’s plenty that you can do, it just might not be what you WANT to do.” Generally, when someone is at this point they’ve already found clear indications that their world has changed, and the results of that change are becoming apparent.  Customers may be leaving in increasing numbers, revenues are down, profitability is tanking, your best employees are leaving, and so on.  Or worse, some upstart has already shown up in your market and is eating your lunch, Uber-style.

Depression sets in once you realize that not only have things around you changed, but that you yourself must also change if you’re going to survive.  Passively accepting change is hard enough.  Full-blown depression sets in when you realize that you must respond to outside change by actively changing yourself.  Your task just got at least twice as hard, and it’s a bummer, isn’t it?

In Big Data, depression comes when you realize that your hyper-expensive, 23 wonkabyte, 10,000 node Hadoop cluster didn’t magically solve every problem your business has ever had.  It comes when you realize that the “data scientist” you hired off LinkedIn didn’t have a PhD in statistics, they had a PhD in the Appreciation of Statistics, and a minor in creative writing. It comes when you realize that your new Blockchain marketplace is just as hackable as every other “hack-proof” technology that has ever come before, in the six millennia that people have been trying to make information “Hack-Proof.” And depression comes when you realize that everything that your own compliance department said that you couldn’t possibly do was just done by six college drop-outs living on a boat anchored 12.01 nautical miles off the coast of the United States.

As we enter what Gartner Group calls the “trough of disillusionment” with Big Data, expect to see a lot more depressed people wondering what to do next.

Testing

Testing is where we begin to explore the possibility that what has happened is not going to undo itself, and what that reality might entail. We attempt to see what our new world might look like, and try to see ourselves in it.  Our minds do this all the time.  When we choose our preferred narrative, our preferred response and our desired outcome, we call it daydreaming. Daydreaming is fun, because we are in control.  Testing is not as much fun, because we don’t control the narrative, our response is what we hope is the best compromise, and the outcome is very much in doubt.  Nonetheless, reaching the point of testing is a good sign, because you’re returning to your present self and preparing to deal with change rationally and productively.

In the arena of Big Data disruption, Testing starts to show itself when business leaders start to brainstorm how they could change, what those changes would entail, and what benefits might accrue.  If these thoughts are joined with a healthy dose of “and here’s the pain of not changing,” then there’s a much greater chance of success in whatever steps you choose.

Acceptance

Acceptance feels like a release, because it is.  It is a release of all of the doubt, hurt, fear and other negative energies that cause us to freeze in the first place.  There’s a freedom in finally acknowledging change.  There’s a clarity that comes from finally accepting that you must change, and that you have no alternative.  Making a choice, even if it is one you don’t want to make, frees up enormous psychological energy.  Rather than focusing on methods of prevention or escape, you focus on how to succeed with your new reality.

When I wrote this I thought to myself, “Is this true? What about The Shawshank Redemption?”  Andy Dufresne seemed never to succumb to his reality in prison.  Andy never accepted that he was a convict, and struggled to regain his freedom. Superficially, he never gave up hope that he would get out. But look closer and you will find that Stephen King had a deeper thesis than this. In the story, Andy does indeed go through the seven phases of dealing with loss (of his freedom).

At the end of the movie, when he is finally looking for the strength to follow his escape plan he only gets there through acceptance.  His last statement to his friend Red is, “Get busy living or get busy dying.”  This scared Red to no end, as he interpreted this to mean Andy would kill himself.  But, this was not the case.  Andy wasn’t ending his life, he was ending his denial and indecision over his situation.  He came to accept that if he didn’t make a risky choice, then he was doomed by the world around him.  What appeared in the movie to be defeat and resignation was actually acceptance.  And this acceptance finally gave Andy the courage to take control of his situation, and reclaim his life.

This movie, The Shawshank Redemption, was not a story about how people suffer from injustice; it was a story about how Andy chose to overcome injustice through acceptance and action.

Holistic Medicine for Your Data

Wherever you are in your process of dealing with Big Data-Induced Grief, we are here to help.  We know what you’re going through because we’ve gone through it, too, and we took good notes along the way.  Surdak & Company won’t keep you from getting scars.  After all, you are crossing a chasm filled with digital thorns and sharing-economy dragons.  But, most of us have already earned our first aid merit badges, and we are well-stocked with Bactine.


How Big Data Enhances Marketing

contributed by: Nathan Greenberg, Managing Director

For some, the process by which big data enhances marketing is obvious. For others, it takes some education. Regardless of your title, experience, or industry, you must understand one fundamental principle: marketing is not about selling. Neither is advertising about selling. At their cores, both activities are about forming, maintaining, and optimizing a connection with your customer. The way to best meet this goal is the same in business as it is in social interaction: understanding.

The Key to Understanding

Connections are stronger and last longer with deep understanding. They are emotional. The more you know about a person and their motivation, the better you can relate to them, empathize, sympathize, and share in their experiences. Buying something, even the decision to buy, is an emotional experience. These emotion-driven actions are not only applicable to friendships or familial relationships; they form the strongest foundations for relationships between a company and its customers. If you get this right, you can achieve long-term customer engagement. Get this wrong, and you may never earn another shot.

Effective understanding for marketing purposes relies on the following six key elements:

  • WHO are they? Answer with detailed demographic information. The more details, the better the answer.
  • WHAT are they seeking? Answer with the solution they seek. For example, “a clean house” is a goal, while “a vacuum cleaner” is the solution to reach that goal. A vacuum cleaner under $400 with HEPA filters is a more detailed description of the solution they seek.
  • WHEN are they seeking? Answer with details about their timeline and yours. How long is their buying cycle? How long is your sales cycle? Where do they first intersect?
  • WHERE are they seeking? Answer with details about their geographic preferences: brick and mortar, online, both? How do they define their geo?
  • HOW are they seeking? Answer with details about their buying cycle. For example, what is their Zero Moment of Truth (ZMOT)? How many resources (referrals, ads, blogs, etc.) are at their disposal? What value do they place on each resource?

If generating the right answers to those questions doesn’t keep you busy for at least a couple hours, your answers are wrong, you don’t know enough about your customers, or you just don’t care. Or some combination of the three. Perhaps you believe you already know their answers. In the era of Big Data, believing that you know these things without having facts may be increasingly fatal to your business.

But I said six key elements. Have you already figured out what is missing?

The Importance of Why

“Why?” is the most important and complex question for successful marketing. It closes the loop of understanding, opens new opportunities, and forces new ideas. Also, context has trumped content as the dominant means of understanding your customers’ “why.”  For example, if you owned a printing company and Donald Trump bought stickers that said “Trump Pence” on Monday, July 11, it would have seemed strange. Strange, until the “why” question was answered. At that moment you became one of the few people on the planet with early knowledge of his choice of Vice Presidential running mate…and you would likely have been sworn to secrecy.

“Why” mattered a great deal.

But that example barely scratches the surface of why “why” matters. The question has multiple facets. Let’s go deeper to explore more whys.

  • Why did he pick your printing company? Quality, price, location, great Facebook reviews? Knowing that answer serves as an input for future marketing decisions.
  • Why did he order them on July 11? Understanding his (and other customers’) buying cycle serves as an input for future marketing decisions.
  • Why did he search for a printing company instead of using an existing relationship? If you offered something his current shop did not, you have a competitive advantage that needs to be promoted.

The answer to “why” can help bring understanding, or even better answers, to “who,” “what,” “when,” etc. The unstructured data (“why”) that you may or may not have had previously offers clarity and insight into the structured data you have (hopefully) been studying for so long. Unstructured data is your most valuable untapped resource.
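To make the six questions concrete, here is one hypothetical way to structure them as a profile record, with who/what/when/where/how as structured fields and “why” as unstructured free text. Every field name and value is invented for illustration.

```python
# Hypothetical customer profile: five structured questions plus the
# unstructured "why" that explains them. All values are invented.
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    who: dict     # demographics, e.g. {"age": 34, "region": "US-West"}
    what: str     # the solution sought, not just the goal
    when: str     # where they sit in their buying cycle
    where: str    # channel preference: "in-store", "online", "both"
    how: list     # resources consulted: referrals, ads, blogs, reviews
    why: str      # unstructured: the motivation, in their own words

profile = CustomerProfile(
    who={"age": 34, "region": "US-West"},
    what="HEPA vacuum under $400",
    when="researching, pre-purchase",
    where="online",
    how=["search ads", "review blogs"],
    why="new baby at home; worried about dust and allergies",
)
# The structured fields tell you what happened; the "why" field tells
# you which message will actually connect.
print(profile.why)
```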

What About Big Data and Marketing?

Success with Big Data is not about technology.  It’s not about creating the perfect data warehouse from the latest technology from the coolest vendor.  Instead, it is about doing things that you haven’t done before.  In particular, it is about analyzing unstructured data, which until recently was not readily available or analyzable.

The combined analysis of structured and unstructured data makes the difference. Analog companies of the last few decades were focused on structured data: quantifiable information such as demographics and sales volume. A few companies trying to be hip and edgy started looking at their unstructured data, but failed to incorporate their previous structured data analysis or ignored it altogether. The two must work cohesively toward the same goal.
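As a sketch of what that combination can look like, assuming a Python/pandas workflow (the column names and figures are invented), structured purchase records are joined with a sentiment signal distilled from unstructured reviews:

```python
# Illustrative join of structured and unstructured signals (invented data).
import pandas as pd

# Structured: the numbers analog companies have always tracked.
sales = pd.DataFrame({
    "customer": ["ana", "bob", "cho"],
    "orders_last_year": [12, 3, 9],
    "avg_order_usd": [54.0, 210.0, 88.5],
})

# Unstructured, distilled: pretend this came from an NLP pass over reviews.
review_sentiment = pd.DataFrame({
    "customer": ["ana", "bob"],
    "sentiment": [0.9, -0.6],
})

combined = sales.merge(review_sentiment, on="customer", how="left")
print(combined)
# A customer whose structured numbers look healthy but whose sentiment
# is negative (bob) is a churn risk neither dataset reveals on its own.
```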

Big Data enhances marketing because it enables better understanding of your audience. Having huge amounts of information at your fingertips is useless if it is not put to use. Customer service ratings on Google, logistics analyses, retail in-store behavior studies, even employee reviews all offer real-world examples of “why”. They serve as a means of gathering data that can be measured, analyzed, and used as input for future marketing decisions.

The Courage to Follow Big Data

For Big Data to enhance marketing, it should challenge many of your current beliefs.  It will almost necessarily be scary. This data must be reviewed and then reflected against your goals. Where do you want to be tomorrow? Next year? Next decade? Big Data can help you reach those milestones or give you incontrovertible evidence that one or more of your goals is not achievable. You must have the courage to follow the data. Informed decisions are better than blindfolded throws at the dart board.

Think of your structured and unstructured data more as inputs than support. If all you seek is justification for your existing decision, you won’t be in business much longer. You are competing against leaders much more hungry for success than glory. Their goal is to reach the market, not force the market to reach them. With an effective analysis of your data you are empowering yourself and your team to meet an unmet need.
