Thursday, April 14, 2016

12 Attributes of a Great Leader

"A managers output is the output of the organization under her supervision or influence." - Andy Grove

I believe that most managers want to be great managers. In fact, many aspire to transcend management and to be deemed leaders. While there are countless books on the topic, many are too heavy on theory and too light on practice to be relevant and applicable. One of the main roles of a leader is to teach: through actions of commission, actions of omission, and through thoughtful dialogue. The goal of this series is to share what I believe are the hallmarks of great management.

In High Output Management, Andy Grove explores why, at times, an individual is not able to achieve their potential in a job. He simplifies it to one of two reasons: 1) they are incapable, or 2) they are not motivated. In either case, it's the responsibility of the manager to assess and remediate the situation. This is neither comfortable nor easy, which is why great leadership is difficult.

I will focus on what I think are the 12 defining attributes of a great leader:

1) Team builder - assembling and motivating teams.
2) Running teams - a disciplined management system, based on thoughtful planning.
3) Expectations, accountability, and empowerment - the #1 issue I see is here.
4) Being on offense, not defense - leading instead of reacting.
5) Engagement and influence - creating informal influence broadly.
6) Operational rigor - managing the details, without micro-managing.
7) Clear and candid communication - never leaving a gray area.
8) Training - a critical role of a manager.
9) Mental toughness - never talked about enough, yet many managers fail due to this aspect alone.
10) Strategic thinking - having a point of view, differentiated and right.
11) Obsessing over clients - knowing who pays the bills and applying it to every decision.
12) Positive attitude - motivating by example.

I'll cover each topic via blog and/or podcast.

"In classical times, when Cicero had finished speaking, the people said, 'How well he spoke', but when Demosthenes had finished speaking, they said, 'Let us march'"- Adlai Stevenson

Monday, March 21, 2016

Pattern Recognition

Elements of Success Rhyme

The science of pattern recognition has been explored for hundreds of years, with the primary goal of optimally extracting patterns from data or situations, and effectively separating one pattern from another. Applications of pattern recognition are found everywhere, whether it’s categorizing disease, predicting outbreaks of disease, identifying individuals (through face or speech recognition), or classifying data. In fact, pattern recognition is so ingrained in many things we do, we often forget that it’s a unique discipline which must be treated as such if we want to really benefit from it.

According to Tren Griffin, a prominent blogger and IT executive, Bruce Dunlevie, a general partner at the venture capital firm Benchmark Capital, once said to him, “Pattern recognition is an essential skill in venture capital.” Griffin elaborates on the point Dunlevie was making: “while the elements of success in the venture business do not repeat themselves precisely, they often rhyme. In evaluating companies, the successful VC will often see something that reminds them of patterns they have seen before.” Practical application of pattern recognition for business value is difficult. The great investors have a keen understanding of how to identify and apply patterns.

Pattern Recognition: A Gift or a Trap?

Written in 2003 by William Gibson, Pattern Recognition (G.P. Putnam’s Sons) is a novel that explores the human desire to synthesize patterns in what is otherwise meaningless data and information. The book chronicles a global traveler, a marketing consultant, who has to unravel an Internet-based mystery. In the course of the book, Gibson implies that humans find patterns in many places, but that does not mean that they are always relevant. In one part of the book, a friend of the marketing consultant states, “Homo sapiens are about pattern recognition. Both a gift and a trap.” The implication is that humans find some level of comfort in discovering patterns in data or in most any medium, as it helps to explain what would otherwise seem to be a random occurrence. The trap comes into play when there is really not a pattern to be discovered because, in that case, humans will be inclined to discover one anyway, just for the psychological comfort that it affords.

Patterns are useful and meaningful only when they are valid. The bias that humans have to find patterns, even if patterns don’t exist, is an important phenomenon to recognize, as that knowledge can help to tame these natural biases.

Tsukiji Market

The seafood will start arriving at Tsukiji before four in the morning, so an interested observer must start her day quite early. The market will see 400 different species passing through on any given day, eventually making their way to street carts or the most prominent restaurants in Tokyo. The auction determines the destination of each delicacy. In any given year, the fish markets in Tokyo will handle over 700,000 metric tons of seafood, representing a value of nearly $6 billion.

The volume of species passing through Tsukiji represents an interesting challenge in organizing and classifying the catch of the day. In the 2001 book Pattern Classification (Wiley), Richard Duda provided an interesting view of this process, using fish as an example.

With a fairly rudimentary example — fish sorting — Duda is able to explain a number of key aspects of pattern recognition.

A worker in a fish market, Tsukiji or otherwise, faces the problem of sorting fish on a conveyor belt according to their species. This must happen over and over again, and must be done accurately to ensure quality. In Duda’s simple example in the book, it’s assumed that there are only two types of fish: sea bass and salmon.

As the fish come in on the conveyor belt, the worker must quickly determine and classify each fish's species.

There are many factors that can distinguish one type of fish from another: the length, width, weight, number and shape of fins, size of head or eyes, and perhaps the overall body shape.

There are also a number of factors that could interrupt or negatively affect the process of distinguishing (sensing) one type from the other. These factors may include the lighting, the position of the fish on the conveyor belt, the steadiness of the camera taking the picture, and so on.

The process, to ensure the most accurate determination, consists of capturing the image, isolating the fish, taking measurements, and making a decision. However, the process can be enhanced or complicated, based on the number of variables. If an expert fisherman indicates that a sea bass is longer than a salmon, that's an important data point, and length becomes a key feature to consider. However, a few data points will quickly demonstrate that while sea bass are longer than salmon on average, there are many examples where that does not hold true. Therefore, we cannot make an accurate determination of fish type based on that factor alone.

With the knowledge that length cannot be the sole feature considered, selecting additional features becomes critical. Multiple features — for example, width and lightness — start to give a higher-confidence view of the fish type.

Duda defines pattern recognition as the act of collecting raw data and taking an action based on the category of the pattern. Recognition is not an exact match. Instead, it’s an understanding of what is common, which can be expanded to conclude the factors that are repeatable.

A Method for Recognizing Patterns

Answering the three key questions (what is it?, where is it?, and how is it constructed?) seems straightforward — until there is a large, complex set of data to be put through that test. At that point, answering those questions is much more daunting. Like any difficult problem, this calls for a process or method to break it into smaller steps. In this case, the method can be as straightforward as five steps, leading to conclusions from raw inputs:

1. Data acquisition and sensing: The measurement and collection of physical variables.

2. Pre-processing: Removing noise from the data and starting to isolate patterns of interest. In the fish example given earlier, you would isolate the fish from each other and from the background, so that patterns are well separated and not overlapping.

3. Feature extraction: Finding a new representation in terms of features. For the fish, you would measure certain features.

4. Classification: Utilizing features and learned models to assign a pattern to a category. For the fish, you would clearly identify the key distinguishing features (length, weight, etc.).

5. Post-processing: Assessing the confidence of decisions, by leveraging other sources of information or context. Ultimately, this step allows the application of context-dependent information, which improves outcomes.
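The five steps above can be sketched in code. The following is a minimal Python illustration of the sea bass versus salmon example; the measurements, features, and decision rule are all hypothetical stand-ins, not values from Duda's book:

```python
# A minimal sketch of the five-step pattern recognition pipeline,
# using the sea bass vs. salmon example. All numbers are hypothetical.

def acquire():
    # 1. Data acquisition and sensing: raw measurements for each fish.
    # Here, (length_cm, lightness) tuples stand in for camera output.
    return [(90.0, 2.1), (55.0, 7.8), (82.0, 3.0), (60.0, 6.5)]

def preprocess(raw):
    # 2. Pre-processing: drop obviously bad readings (noise removal).
    return [r for r in raw if r[0] > 0 and r[1] > 0]

def extract_features(sample):
    # 3. Feature extraction: represent each fish by the two features
    # that best separate the classes: length and lightness.
    length, lightness = sample
    return {"length": length, "lightness": lightness}

def classify(features):
    # 4. Classification: a hand-set linear decision boundary.
    # Assumes sea bass tend to be longer and darker than salmon.
    score = 0.05 * features["length"] - 0.5 * features["lightness"]
    return "sea bass" if score > 2.0 else "salmon"

def postprocess(label, features):
    # 5. Post-processing: attach a crude confidence based on distance
    # from the decision boundary (context could adjust this further).
    score = 0.05 * features["length"] - 0.5 * features["lightness"]
    confidence = min(1.0, abs(score - 2.0) / 2.0)
    return label, round(confidence, 2)

for sample in preprocess(acquire()):
    feats = extract_features(sample)
    label, conf = postprocess(classify(feats), feats)
    print(sample, "->", label, "confidence", conf)
```

A real system would learn the decision boundary from labeled examples rather than hand-setting it, but the five stages stay the same.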

Pattern recognition techniques find application in many areas, from machine learning to statistics, from mathematics to computer science. The real challenge is practical application. And to apply these techniques, a framework is needed.

Elements of Success Rhyme (continued)

Pattern recognition can be a gift or a trap.

It’s a trap if a person is lulled into believing that history repeats itself and therefore there is simply a recipe to be followed. This is lazy thinking, which rarely leads to exceptional outcomes or insights.

On the other hand, it’s a gift to realize that, as mentioned in this chapter’s introduction, the elements of success rhyme. Said another way, there are commonalities between successful strategies in businesses or other settings. And the proper application of a framework or methodology to identify patterns and to understand what is a pattern and what is not can be very powerful.

Humans have an inherent bias to seek patterns, even where patterns do not exist. Distinguishing a genuine pattern from the product of that bias is a differentiator in the Data era. Indeed, big data provides a means of identifying statistically significant patterns in order to avoid these biases.
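One simple way to check whether an apparent pattern is statistically significant, rather than a product of our pattern-seeking bias, is a permutation test. The sketch below uses made-up numbers purely for illustration:

```python
# A small sketch of a permutation test: is the observed difference
# between two groups a real pattern, or plausible under pure chance?
# The data below are invented for illustration.
import random

random.seed(42)

group_a = [12.1, 13.4, 12.8, 14.0, 13.1]   # e.g., outcomes under strategy A
group_b = [11.0, 11.9, 12.2, 11.5, 12.0]   # e.g., outcomes under strategy B

observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

# Repeatedly shuffle the pooled data and re-split it; count how often
# a random split produces a difference as large as the observed one.
pooled = group_a + group_b
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:5]) / 5 - sum(pooled[5:]) / 5
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / trials
print(f"observed difference: {observed:.2f}, p-value: {p_value:.4f}")
# A small p-value suggests a real pattern; a large one suggests we may
# be "discovering" structure that isn't there.
```

This is the gift-versus-trap question made quantitative: the test asks how often random noise alone would produce the pattern we think we see.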

This post is adapted from the book, Big Data Revolution: What farmers, doctors, and insurance agents teach us about discovering big data patterns, Wiley, 2015. Find more on the web at

Thursday, February 25, 2016

Ubuntu: A New Way to Work

“Teamwork and intelligence win championships.” — Michael Jordan

An anthropologist was dispatched to Africa many years ago to study the lives and customs of local tribes. While each tribe is unique, they share many customs across geographies and locations. The anthropologist tells a story of how one time he brought along a large basket of candy, which quickly got the attention of all the children in the tribe. Instead of just handing it out, he decided to play a game. He set the basket of candy under a tree and gathered all of the children about 50 yards away from the tree. He informed them that they would have a race, and that the first child to get there could keep all of the candy to themselves. The children lined up, ready for the race. When the anthropologist said “Go”, he was surprised to see what happened: all of the children joined hands and moved towards the tree in unison. When they got there, they neatly divided up the candy and sat down to enjoy it together. When he questioned why they did this, the children responded, “Ubuntu. How could any of us be happy if all the others were sad?”

Nelson Mandela describes it well: “In Africa, there is a concept known as Ubuntu - the profound sense that we are human only through the humanity of others; that if we are to accomplish anything in this world it will in equal measure be due to the work and achievements of others.”


Read the rest on Medium.

Tuesday, February 16, 2016

Decentralized Analytics for a Complex World

In 2015, General Stan McChrystal published Team of Teams: New Rules of Engagement for a Complex World. It was the culmination of his experience adapting to a world that had changed faster than the organization he was responsible for leading. When he assumed command of the Joint Special Operations Task Force in 2003, he recognized that their typical approaches to communication were failing. The enemy was a decentralized network that could move very quickly, and accordingly, none of his organization's traditional advantages (equipment, training, etc.) mattered.

He saw the need to re-organize his force as a network, combining transparent communication with decentralized decision-making authority. Said another way, decisions should be made at the lowest level possible, as quickly as possible, and then, and only then, should data flow back to a centralized point. Information silos were torn down and data flowed faster, as the organization became flatter and more flexible.

Observing that the world is changing faster than ever, McChrystal recognized that the endpoints were the most valuable, and the place where most decision-making should take place. This prompted the question:

What if you could combine the adaptability, agility, and cohesion of a small team with the power and resources of a giant organization?


As I work with organizations around the world, I see a similar problem to the one observed by General McChrystal: data and information are locked into an antiquated and centralized model. The impact is that the professionals in most organizations do not have the data they need, in the moment it is required, to make the optimal decision. Even worse, most investments around Big Data today are not addressing this problem, as they are primarily focused on reducing the cost of storage or simply augmenting traditional approaches to data management. Enterprises are not moving along the Big Data maturity curve fast enough:

While it's not life or death in most cases, the information crisis in organizations is reaching a peak. Companies have not had a decentralized approach to analytics to complement their centralized architecture. Until now.


Today, we are announcing Quarks: an open-source, lightweight, embedded streaming analytics runtime, designed for edge analytics. It can be embedded on a device, a gateway, or really anywhere, to analyze events locally, on the edge. For the first time, analytics will be truly decentralized. This will shorten the window to insight, while reducing communication costs by sending only the relevant events back to a centralized location. What General McChrystal did to modernize complex field engagements, we are doing for analytics in the enterprise.

While many believe that the Internet of Things (IoT) may be over-hyped, I would assert the opposite; we are just starting to realize the enormous potential of a fully connected world. A few data points:

1) $1.7 trillion of value will be added to the global economy by IoT in 2019. (source: Business Insider)
2) The world will grow from 13 billion to 29 billion connected devices by 2020. (source: IDC)
3) 82% of enterprise decision makers say that IoT is strategic to their enterprise. (source: IDC)
4) While exabytes of IoT data are generated every day, 88% of it goes unused. (Source: IBM Research)

Despite this obvious opportunity, most enterprises are limited by the costs and time lag associated with transmitting data for centralized analysis. To compound the situation, data streams from IoT devices are complex, and there is little ability to reuse analytical programs. Lastly, 52% of developers working on IoT are concerned that existing tools do not meet their needs (source: Evans Data Corporation). Enter, the value of open source.


Quarks is a programming model and runtime for analytics at the edge. It includes a programming SDK, a lightweight and embeddable runtime, and is open source (incubation proposal), available on GitHub.

This gives data engineers what they want:

− Easy access to IoT data streams
− Integrated data at rest with IoT data streams
− Curated IoT data streams
− The ability to make IoT data streams available for key stakeholders

This gives data developers what they want:

− Access to IoT data streams through APIs
− The ability to deploy machine learning, spatial temporal and other deep analytics on IoT data streams
− Familiar programming tools like Java or Python to work with IoT data streams
− The ability to analyze IoT data streams to build cognitive applications

Analytics at the edge is finally available to everyone, starting today, with Quarks. And the use cases are extensive. For example, in 2015, Dimension Data became the official technology partner for the Tour de France, the world's largest and most prestigious cycling race.

In support of their goal to revolutionize the viewing experience of billions of cycling fans across the globe, Dimension Data leveraged IBM Streams to analyze thousands of data points per second, from over 200 riders, across 21 days of cycling.

The potential of embedding Quarks in connected devices on the network edge (essentially on each bike) would enable a new style of decentralized analytics: detecting critical race events in real time as they happen (a major rider crash, for example), rather than having to infer these events from location and speed data alone. With the ability to analyze data at the endpoint, that data stream can then be integrated with Kafka, etc., and moved directly into Hadoop for storage or Spark for analytics. This will drive analytics at an unprecedented velocity in enterprises.
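The edge pattern described above can be sketched conceptually. The following is not the Quarks SDK itself (Quarks exposes a Java programming model); it is a hedged Python sketch of the same idea, with made-up sensor readings and a hypothetical crash-detection threshold:

```python
# A conceptual sketch (not the actual Quarks Java SDK) of edge
# analytics: analyze each reading locally on the device and forward
# only the relevant events upstream. Readings and the deceleration
# threshold are hypothetical.

def sensor_readings():
    # Simulated speed readings (km/h) from a rider's bike, one per second.
    yield from [42.0, 41.5, 43.0, 12.0, 0.0, 0.0, 38.0]

def detect_crash_events(readings, drop_threshold=20.0):
    # Edge logic: flag a sudden deceleration as a possible crash,
    # instead of streaming every raw reading back to the data center.
    previous = None
    for speed in readings:
        if previous is not None and previous - speed >= drop_threshold:
            yield {"event": "possible_crash", "from": previous, "to": speed}
        previous = speed

# Only the filtered events would cross the network (e.g., into Kafka);
# the raw one-reading-per-second stream never leaves the edge device.
events = list(detect_crash_events(sensor_readings()))
print(events)
```

The design point is the one the post makes: the decision is made at the endpoint, and only the event of interest flows back to the centralized system.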


We live in a world of increasing complexity and speed. As General McChrystal described, organizations that rely solely on centralized architectures for decision making and information flow will fail. At IBM, we are proud to lead the decentralization of analytics, complementing centralized architectures, as a basis for Cognitive Computing.

Friday, February 5, 2016

Gather the Fruit and Burn the Tree

There is an old story about a gentleman walking through the countryside who comes upon a plum orchard. As he walks through the orchard, he notices a plum tree heavy with ripe fruit, but the tree has crashed into ruins on the ground. He starts to survey the tree to determine if it collapsed due to the weight of the fruit or a recent storm. The farmer of the orchard walks up and promptly shows the man that insects have eaten through a good portion of the tree, causing its collapse. The man turns to the farmer and says, “Well, what do you do now?” The farmer replies, “It’s time to gather the fruit and burn the tree.”


Finding and cultivating the right mentors will change your life and career. While I have a number of people asking me to act as their mentor, I feel like I'm a pretty average mentor. One of the main reasons I agree to act as a mentor is that I learn a lot in the process. In many cases, it's not me telling the mentee what they need to hear; it's me saying what we both need to hear. One of my goals over the next 12 months is to become a better mentor, in order to really help people raise their own personal bar of achievement. I want to provide whatever ‘fruit’ is needed, and hopefully propel them onto whatever is next.

I believe that a mentor is fundamentally responsible for doing 4 things:

1) Inspire
2) Teach
3) Encourage
4) Positively affect

Listening is not good enough. Neither is simply giving advice. There has to be more, and I think doing two or more of the above, in any interaction, is a worthy goal.


In order to choose or act as a mentor, I think it takes more than understanding the responsibilities of the role; you also have to understand what makes a good mentor. I listened to a podcast recently that was talking about board members of public/private companies. The assertion was that a person is qualified as a board member (“quad-qualified”) if they have the following attributes:

1) Independence
2) Bandwidth
3) Motivation
4) Expertise

For anyone looking for a mentor, you should ensure that your mentor is “quad-qualified”:

1) Independence - If you work for them, they are not independent. If you work with them, they may not be independent. You need someone truly independent.
2) Bandwidth - In addition to their job, how many other people are they mentoring? If they don't have the bandwidth to spend time with you, neither of you will get maximum value out of the relationship.
3) Motivation - Are they personally motivated to help you? Do you have a past together, such that they would personally invest in you? In my experience, "blind date mentorships" don't work due to a lack of motivation.
4) Expertise - Do they work and live in a similar environment, such that they can provide relevant expertise? Or do you intentionally need someone with a different background/expertise?

If you are seeking a mentor, take the process seriously and ensure that they are quad-qualified. If not, you are probably both wasting your time.


I think the only way that I can really help as a mentor is to help others think about their specific situations/careers through a different set of eyes. It may be helping them with skill development (encourage/train), it may be helping them see the bigger picture (inspire), or perhaps just sharing how I have approached similar situations (teach and positively affect). I typically try to help mentees in a few ways:

- Getting comfortable, being uncomfortable.
- Challenge them on their preparation: what they read, study, etc.
- Understand what they look forward to every day and how it's relevant to success
- Prioritize: what’s the one thing you can do this week, such that by doing it, everything else would be easier?
- Apply Pareto to their to-do list (3 P’s, etc)
- Understand and apply the Rockefeller habits (Priorities, Rhythm, Data)
- Be on offense
- Career decisions
- Feedback loops
- Pre-mortems
- 7 measures of Good Enough
- Hill climbing
- Goal setting

One of the primary themes I have noticed across most of my mentor/mentee relationships is that most people tend to overvalue the near term and undervalue long-term rewards. This is a shame, because I see people making sub-optimal decisions and setting off up the wrong hill. I try to remind people that you can always trade up…but sometimes it's best to defer the easy step and focus on your real goals.


Gather the Fruit and Burn the Tree. For a mentee, a mentor/mentee relationship is about absorbing whatever you can and then focusing on what's next. While these can certainly be long-term relationships, they don't have to be; I don't think any great mentor has that expectation. Everyone should cultivate mentors and mentees throughout their career; gather what they can, and focus on the next step.

Monday, December 21, 2015

Second-Level Thinking

Howard Marks is a well respected investor and the founder of Oaktree Capital Management. In a recent letter to investors, he introduced a concept that he calls 'Second-Level Thinking'. In his words:

This is a crucial subject that has to be understood by everyone who aspires to be a superior investor. Remember your goal in investing isn’t to earn average returns; you want to do better than average. Thus your thinking has to be better than that of others – both more powerful and at a higher level. Since others may be smart, well-informed and highly computerized, you must find an edge they don’t have. You must think of something they haven’t thought of, see things they miss, or bring insight they don’t possess. You have to react differently and behave differently. In short, being right may be a necessary condition for investment success, but it won’t be sufficient. You must be more right than others . . . which by definition means your thinking has to be different. . .

For your performance to diverge from the norm, your expectations have to diverge from the norm, and you have to be more right than the consensus. Different and better: that’s a pretty good description of second-level thinking.

Second-level thinking is deep, complex and convoluted.

Certainly, he sets a high mark for how to stretch our thinking.

In the context of the technology industry, I would use the following examples to contrast first-level and second-level thinking around building products:

First-level thinking says, “Clients are asking for this; this functionality will fill a need.” Second-level thinking says, “It’s something that our clients are asking for, but everyone is asking for that. Therefore, every competitor is pursuing it, and it’s just a race to the finish that will quickly commoditize; let’s go in a different direction.”

First-level thinking says, “The IT analyst firms say this market will have low growth and most companies already have the capability. Let’s focus on a different market.” Second-level thinking says, “The outlook stinks, but everyone else is abandoning it. We could reinvent how clients are consuming in this critical area. Double down!”

These are rudimentary and simple, but hopefully sufficient examples for how Second-Level Thinking may apply in the technology industry.


Market Forces at Work

We are in an unprecedented business cycle. Protracted low interest rates have discouraged saving, and therefore money is put to work. At the same time, the rise of activist investors has altered traditional approaches to capital allocation. Public companies are being pushed to monetize their shareholders' investments, either in the form of dividends or buybacks (and most often both). Because of this unrelenting pressure on public companies, investment has begun to flow more drastically towards private enterprises (at later and later stages), leading to the 'unicorn' phenomenon. These 'unicorn' companies, which have the time and resources in their current form, are doing 3 things:

1) Paying anything for talent, causing wage inflation for engineers and some other roles.
2) Attempting to re-invent many industries, by applying technology and in many cases, shifting them to a pay-as-you-go (or as-a-service) model.
3) Spending aggressively, in any form necessary, to drive growth.

Public companies, in some cases, are crowded out of the investments they would normally make, given this landscape. But a central truth remains: at some point, an enterprise must make money. That timeline is typically compressed when capital begins to dry up. The term 'unicorn' was first used to connote something that is rarely seen. The fact that they are now on every street corner is perhaps an indication that time is short.


The Impact

1) "Winter is coming" for the engineering wage cycle. Currently, this inflation is driven in part by supply/demand but more so by the cult of "free money" and nothing else better to do with it. At some point, when 'hire at any cost' dissipates, we will know who has truly built differentiated skills.

2) The rise of cloud and data science will eliminate 50% of traditional IT jobs over the next decade. Read more here. The great re-skilling must start now, for companies that want to lead in the data era. Try this.

3) As-a-service is a cyclical change (not secular). The length of the cycle is anyone's guess. And, as with most cycles, it will probably last longer and end faster than most people believe. Much of this cycle is driven by the market forces described above (less money for capex, since all of it is being spent on buybacks/dividends). At some point, companies will realize that 'paying more in perpetuity' is not a good idea, and there will be a Reversion to the Mean.

4) Centralized computing architectures (cloud) will eventually diminish in importance. Right now, we are in a datacenter capital arms race, much like the telcos were in 1999. But, as edge devices (smartphones, IoT, etc.) continue to advance and the world is blanketed with supercomputers, there will be less of a need for a centralized processing center.

5) Machine Learning is the new 'Intel inside'. This will become a default capability in every product/device, instrumenting business processes and decision making. This will put even more pressure on the traditional definition of roles in an organization.

6) There is now general agreement that data is a strategic asset. Because of this, many IT and cloud providers are seeking to capture data, under the notion that 'data has gravity'. Once it is captured, the belief goes, it is hard to move, and therefore can be monetized. While I understand the concept, it's not very user-centric. Who likes having their data trapped? No one. Therefore, I believe the real winners in this next cycle will be those that can enable open and decentralized data access. This is effectively the opposite of capturing it. It's enabling a transparent and open architecture, with the ability to analyze and drive insights from anywhere. Yet another reason to believe in Spark.


It's debatable if the 6 impacts above represent Second-Level Thinking. While they may to some extent, the real thinking would be to flesh out the implications of each, and place bets on the implications. These are bets that could be made in the form of financial investments, product investments, or "start a new company" investments.

Monday, December 14, 2015

Transforming Customer Relationships with Data


A friend walked into a bank in a small town in Connecticut. As frequently portrayed in movies, the benefit of living in a small town is that you see many people that you know around town and often have a first name relationship with local merchants. It’s very personal and something that many equate to the New England charm of a town like New Canaan. As this friend, let us call him Dan, entered the bank, it was the normal greetings by name, discussion of the recent town fair, and a brief reflection on the weekend’s Little League games.

Dan was in the market for a home. Having lived in the town for over ten years, he wanted to upsize a bit, given that his family was now 20-percent larger than when he bought the original home. After a few months of monitoring the real estate listings, working with a local agent (whom he knew from his first home purchase), Dan and his wife settled on the ideal house for their next home. Dan’s trip to the bank was all business, as he needed a mortgage (much smaller than the one on his original home) to finance the purchase of the new home.

The interaction started as you may expect: “Dan, we need you to fill out some paperwork for us and we’ll be able to help you.” Dan proceeded to write down everything that the bank already knew about him: his name, address, Social Security number, date of birth, employment history, previous mortgage experience, income level, and estimated net worth. There was nothing unusual about the questions except for the fact that the bank already knew everything they were asking about.

After he finished the paperwork, it shifted to an interview, and the bank representative began to ask some qualitative questions about Dan’s situation and needs, and the mortgage type that he was looking for. The ever-increasing number of choices varied based on fixed versus variable interest rate, duration and amount of the loan, and other factors.

Approximately 60 minutes later, Dan exited the bank, uncertain of whether or not he would receive the loan. The bank knew Dan. The bank employees knew his wife and children by name, and they had seen all of his deposits and withdrawals over a ten-year period. They’d seen him make all of his mortgage payments on time. Yet the bank refused to acknowledge, through their actions, that they actually knew him.


There was an era when customer support and service was dictated by what you told the person in front of you, whether that person was a storeowner, lender, or even an automotive dealer. It was then up to that person to make a judgment on your issue and either fix it or explain why it could not be fixed. That simpler time created a higher level of personal touch in the process, but then the telephone came along. The phone led to the emergence of call centers, which led to phone tree technology, which resulted in the decline in customer service.


While technology has advanced exponentially since the 1800s, customer experience has not advanced as dramatically. Customer interaction has been streamlined and automated in many cases, but it is debatable whether those cost-focused activities have engendered customer loyalty, which should be the ultimate goal.

The following list identifies the main historical influences on customer service. Each era has seen technological advances and along with that, enhanced interaction with customers.

Pre-1870: In this era, customer interaction was a face-to-face experience. If a customer had an issue, he would go directly to the merchant and explain the situation. While this is not scientific, it seems that overall customer satisfaction was higher in this era than in others, for the simple reason that people treat a person in front of them with more care and attention than they would a person once or twice removed.

1876: The telephone is invented. While the telephone did not replace the face-to-face era immediately, it laid the groundwork for a revolution that would continue until the next major revolution: the Internet.

1890s: The telephone switchboard was invented. Originally, phones worked only point-to-point, which is why phones were sold in pairs. The invention of the switchboard opened up the ability to communicate one-to-many. This meant that customers could dial a switchboard and then be directly connected to the merchant they purchased from or to their local bank.

1960s: Call centers first emerged in the 1960s, primarily a product of larger companies that saw a need to centralize a function to manage and solve customer inquiries. This was more cost effective than previous approaches, and perhaps more importantly, it enabled a company to train specialists to handle customer calls in a consistent manner. Touch-tone dialing (1963) and 1-800 numbers (1967) fed the productivity and usage of call centers.

1970s: Interactive Voice Response (IVR) technology was introduced into call centers to assist with routing and to offer the promise of better problem resolution. Technology for call routing and phone trees improved into the 1980s, but it is not something that ever engendered a positive experience.

1980s: For the first time, companies began to outsource the call-center function. The belief was that if you could pay someone else to offer this service and it would get done at a lower price, then it was better. While this did not pick up steam until the 1990s, this era marked the first big move to outsourcing, and particularly outsourcing overseas, to lower-cost locations.

1990s to present: This era, marked by the emergence of the Internet, has seen the most dramatic technology innovation, yet it’s debatable whether or not customer experience has improved at a comparable pace. The Internet brought help desks, live chat support, social media support, and the widespread use of customer relationship management (CRM) and call-center software.

Despite all of this progress and developing technology through the years, it still seems like something is missing. Even the personal, face-to-face channel (think about Dan and his local bank) is unable to appropriately serve a customer whom the employees know (but pretend not to, when it comes to making business decisions).

While we have seen considerable progress in customer support since the 1800s, the lack of data in those times prevented the intimate customer experience that many longed for. It’s educational to explore a couple of pre-data-era examples of customer service, to understand its strengths and limitations prior to the data era.


The United States entered World War I on April 6, 1917. The U.S. Navy quickly became interested in Boeing’s Model C seaplane. The seaplane was the first “all-Boeing” design and utilized either single or dual pontoons for water landing. The seaplane promised agility and flexibility, features that the Navy felt would be critical to managing the highly complex environment of a war zone. Since all of the testing of the seaplane was conducted in Pensacola, Florida, Boeing had to deconstruct the planes and ship them from the west coast of the United States by rail. In the process, the company chose to send an engineer and a pilot, along with spare parts, in order to ensure the customer’s success. This is the pinnacle of customer service: knowing your customers, responding to their needs, and delivering what is required, where it is required. Said another way, the purchase (or prospect of purchase) of the product assumed customer service.

The Boeing Company and the Douglas Aircraft Company, which would later merge, led the country in airplane innovation. As Boeing expanded after the war years, the business grew to include much more than just manufacturing, with the advent of airmail contracts and a commercial flight operation known as United Air Lines. Each of these expansions led to more opportunities, notably a training school to provide United Air Lines with an endless supply of skilled pilots.

In 1936, Boeing founded its Service Unit. As you might expect, the first head of the unit was an engineer (Wellwood Beall). After all, the mission of the unit was expertise, so a top engineer was the right person for the job. As Boeing expanded overseas, Beall decided he needed to establish a division focused on airplane maintenance and training the Chinese, as China had emerged as a top growth area.

When World War II came along, Boeing quickly dedicated resources to training, spare parts, and maintaining fleets in the conflict. A steady stream of Boeing and Douglas field representatives began flowing to battlefronts on several continents to support their companies’ respective aircraft. Boeing put field representatives on the front lines to ensure that planes were operating and, equally importantly, to share information with the company engineers regarding needed design improvement.

Based on lessons learned from its first seven years in operation, the service unit reorganized in 1943 around four areas:

-Maintenance publications
-Field service
-Spare parts
-Training

To this day, that structure is still substantially intact. Part of Boeing’s secret was a tight relationship between customer service technicians and the design engineers. This ensured that the Boeing product-development team was focused on the things that mattered most to their clients.

Despite the major changes in airplane technology over the years, the customer-support mission of Boeing has not wavered: “To assist the operators of Boeing planes to the greatest possible extent, delivering total satisfaction and lifetime support.” While customer service and the related technology have changed dramatically through the years, the attributes of great customer service remain unchanged. We see many of these attributes in the Boeing example:

1. Publications: Sharing information, in the form of publications available to the customer base, allows customers to “help themselves.”

2. Teamwork: The linkage between customer support and product development is critical to ensuring client satisfaction over a long period of time.

3. Training: Similar to the goal with publications, training makes your clients smarter, and therefore, they are less likely to have issues with the products or services provided.

4. Field service: Be where your clients are, helping them as it’s needed.

5. Spare parts: If applicable, provide extra capabilities or parts needed to achieve the desired experience in the field.

6. Multi-channel: Establishing multiple channels enables the customer to ask for and receive assistance.

7. Service extension: Be prepared to extend service to areas previously unplanned for. In the case of Boeing, this was new geographies (China) and at unanticipated time durations (supporting spare parts for longer than expected).

8. Personalization: Know your customer and their needs, and personalize their interaction and engagement.

Successful customer service entails each of these aspects in some capacity. The varied forms of customer service depend largely on the industry and product, but also the role that data can play.


There are a multitude of reasons why a financial services firm would want to invest in a call center: lower costs through consolidation, improved customer service, cross-selling, and extended geographic reach.

Financial services have a unique need for call centers and expertise in customer service, given that customer relationships are ultimately what they sell (the money is just a vehicle towards achieving the customer relationship). Six of the most prominent areas of financial services for call centers are:

1) Retail banking: Supporting savings and checking accounts, along with multiple channels (online, branch, ATM, etc.)

2) Retail brokerage: Advising and supporting clients on securities purchases, funds transfer, asset allocation, etc.

3) Credit cards: Managing credit card balances, including disputes, limits, and payments

4) Insurance: Claims underwriting and processing, and related status inquiries

5) Lending: Advising and supporting clients on loan products, applications, and payment inquiries

6) Consumer lending: A secured or unsecured loan with fixed terms issued by a bank or financing company. This includes mortgages, automobile loans, etc.

Consumer lending is perhaps the most interesting financial services area to explore from the perspective of big data, as it involves more than just responding to customer inquiries. It involves the decision to lend in the first place, which sets off all future interactions with the consumer.

There are many types of lending that fall into the domain of consumer lending, including credit cards, home equity loans, mortgages, and financing for cars, appliances, and boats, among many other possible items, many of which are deemed to have a finite life.

Consumer lending can be secured or unsecured. This is largely determined by the feasibility of securing the loan (it’s easy to secure an auto loan with the auto, but it’s not so easy to secure credit card loans without a tangible asset), as well as the parties’ risk tolerance and specific objectives about the interest rate and the total cost of the loan. Unsecured loans obviously will tend to have higher returns (and risk) for the lender.

Ultimately, from the lender’s perspective, the decision to lend or not to lend will be based on the lender’s belief that she will get paid back, with the appropriate amount of interest.
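The lend-or-don’t-lend reasoning above can be sketched as a simple expected-value calculation. This is an illustrative model, not one from the book: the repayment probability, interest rate, and the assumption of total loss on default are all hypothetical simplifications.

```python
# Illustrative sketch (hypothetical model): a lender approves a loan when the
# repayment amount, weighted by an estimated probability of being paid back,
# exceeds the principal put at risk.

def expected_return(principal: float, rate: float, p_repay: float) -> float:
    """Expected value of a one-period loan: full repayment with interest if
    the borrower pays, a total loss of principal otherwise."""
    repaid = principal * (1 + rate)
    return p_repay * repaid + (1 - p_repay) * 0.0

def should_lend(principal: float, rate: float, p_repay: float) -> bool:
    # Lend only if the expected repayment exceeds the principal at risk.
    return expected_return(principal, rate, p_repay) > principal

print(should_lend(200_000, 0.05, 0.99))  # strong borrower -> True
print(should_lend(200_000, 0.05, 0.90))  # riskier borrower -> False
```

The better the lender’s data about the borrower, the sharper the estimate of that repayment probability, which is exactly where the data era changes the decision.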

A consumer-lending operation, and the customer service that would be required to manage the relationships, is extensive. Setting it up requires the consideration of many factors:

Call volumes: Forecasting monthly, weekly, and hourly engagement

Staffing: Calibrating on a monthly, weekly, and hourly basis, likely based on expected call volumes

Performance management: Setting standards for performance with the staff, knowing that many situations will be unique

Location: Deciding on a physical or virtual customer service operation, knowing that this decision impacts culture, cost, and performance
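The call-volume and staffing factors above are, at their core, a workload calculation. A minimal sketch with hypothetical numbers (real operations would use a proper queueing model such as Erlang C, which also accounts for wait-time targets):

```python
# Illustrative sketch (hypothetical numbers): converting a forecast of hourly
# call volume into an agent-staffing estimate via offered workload.
from math import ceil

def agents_needed(calls_per_hour: int, avg_handle_minutes: float,
                  utilization: float = 0.85) -> int:
    """Agents required to absorb the offered workload at a target utilization."""
    workload_hours = calls_per_hour * (avg_handle_minutes / 60)  # offered load
    return ceil(workload_hours / utilization)

# A hypothetical Monday-morning peak: 300 calls/hour, 6-minute average handle time.
print(agents_needed(300, 6))  # -> 36
```

Repeating this for each forecast hour yields the monthly, weekly, and hourly staffing calibration the list describes.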

A survey of call center operations from 1997, conducted by Holliday, showed that 64 percent of the responding banks expected increased sales and cross sales, while only 48 percent saw an actual increase. Of the responding banks, 71 percent expected the call center to increase customer retention; however, only 53 percent said that it actually did.

The current approach to utilizing call centers is not working and, ironically, has not changed much since 1997.


Data will transform customer service, as data can be the key ingredient in each aspect of successful customer service. The lack of data, or the failure to use it, is preventing the personalization of customer service, which is why it is not meeting expectations.

In the report titled “Navigate the Future Of Customer Service” (Forrester, 2012), Kate Leggett highlights key areas that depend on the successful utilization of big data. These include: auditing the customer service ecosystem (technologies and processes supported across different communication channels); using surveys to better understand the needs of customers; and incorporating feedback loops by measuring the success of customer service interactions against cost and satisfaction goals.


Servicing automobiles post-sale requires a complex supply chain of information. In part, this is due to the number of parties involved. For example, a person who has an issue with his car is suddenly dependent on numerous parties to solve the problem: the service department, the dealership, the manufacturer, and the parts supplier (if applicable). That is four relatively independent parties, all trying to solve the problem, and typically pointing to someone else as being the cause of the issue.

This situation can be defined as a data problem. More specifically, each party has its own view of the problem in its own systems, which are not integrated, and that contributes to the issue. When any one party looks for similar issues (i.e., queries the data), it receives back only a limited view of the data available.

A logical solution to this problem is to enable the data to be searched across all parties and data silos, and then reinterpreted into a single answer. The challenge with this approach to using data is that it is very much a pull model, meaning that the person searching for an answer has to know what question to ask. If you don’t know the cause of a problem, how can you possibly know what question to ask in order to fix it?

This problem necessitates data to be pushed from the disparate systems, based on the role of the person exploring and based on the class of the problem. Once the data is pushed to the customer service representatives, it transforms their role from question takers to solution providers. They have the data they need to immediately suggest solutions, options, or alternatives. All enabled by data.
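The push model described above can be sketched as follows. The silo names, record fields, and problem classes are all hypothetical; the point is that one assembled, cross-silo view reaches the service representative without the representative needing to know which system to query.

```python
# Illustrative sketch (hypothetical data): pushing related records from
# independent data silos to a service representative, keyed on the problem
# class, rather than waiting for the rep to formulate the right query.

SILOS = {
    "service_dept":   [{"vin": "123", "class": "brakes", "note": "pads replaced"}],
    "dealership":     [{"vin": "123", "class": "brakes", "note": "customer complaint"}],
    "manufacturer":   [{"vin": "123", "class": "brakes", "note": "recall bulletin 42"}],
    "parts_supplier": [{"vin": "999", "class": "engine", "note": "batch defect"}],
}

def push_view(problem_class: str, vin: str) -> list:
    """Assemble one cross-silo view for a problem, so the representative
    starts with every party's records instead of querying each system."""
    view = []
    for silo, records in SILOS.items():
        for r in records:
            if r["class"] == problem_class and r["vin"] == vin:
                view.append({"source": silo, **r})
    return view

for item in push_view("brakes", "123"):
    print(item["source"], "->", item["note"])
```

With this view pushed to them, the representative can open the conversation with candidate solutions instead of questions.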


Mikkel Svane spent many years of his life implementing help-desk software. The complaints from that experience were etched in his mind: it’s difficult to use, it’s expensive, it does not integrate easily with other systems, and it’s very hard to install. This frustration led to the founding of Zendesk.

As of December 2013, it is widely believed that Zendesk has over 20,000 enterprise clients. Zendesk was founded in 2007, and just six short years later, it had a large following. Why? In short, it found a way to leverage data to transform customer service.

Zendesk asserts that bad customer service costs major economies around the world $338 billion annually. Even worse, they indicate that 82 percent of Americans report having stopped doing business with a company because of poor customer service. In the same vein as Boeing in World War II, this means that customer service is no longer an element of customer satisfaction; it is perhaps the sole determinant of customer satisfaction.

A simplistic description of Zendesk would highlight the fact that it is email, tweet, phone, chat, and search data, all integrated in one place and personalized for the customer of the moment. Mechanically, Zendesk is creating and tracking individual customer support tickets for every interaction. The interaction can come in any form (social media, email, phone, etc.) and therefore, any channel can kick off the creation of a support ticket. As the support ticket is created, a priority level is assigned, any related history is collated and attached, and it is routed to a specific customer-support person. But what about the people who don’t call or tweet, yet still have an issue?
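The ticket flow just described (any channel opens a ticket, a priority is assigned, history is attached, the ticket is routed) can be sketched as follows. Zendesk’s actual implementation is not public, so every name, field, and rule here is hypothetical.

```python
# Illustrative sketch (hypothetical names and rules): a support ticket created
# from any channel, prioritized, joined to prior history, and routed to an agent.
from dataclasses import dataclass, field

# Hypothetical store of prior interactions, keyed by customer.
HISTORY = {"dan@example.com": ["2014-01: late-fee dispute resolved"]}

@dataclass
class Ticket:
    customer: str
    channel: str          # "email", "tweet", "phone", "chat", ...
    text: str
    priority: str = "normal"
    history: list = field(default_factory=list)
    assignee: str = ""

def open_ticket(customer: str, channel: str, text: str) -> Ticket:
    t = Ticket(customer, channel, text)
    # Crude priority rule: escalate anything that sounds like churn risk.
    if any(w in text.lower() for w in ("cancel", "closing my account")):
        t.priority = "urgent"
    t.history = HISTORY.get(customer, [])                        # attach history
    t.assignee = "tier2" if t.priority == "urgent" else "tier1"  # route
    return t

t = open_ticket("dan@example.com", "tweet", "Thinking of closing my account.")
print(t.priority, t.assignee, len(t.history))  # -> urgent tier2 1
```

The key property is that every channel funnels into the same ticket object, so the agent sees one personalized record regardless of how the customer reached out.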

Zendesk has also released a search analytics capability, which is programmed using sophisticated data modeling techniques to look for customer issues, instead of just waiting for the customer to contact the company. A key part of the founding philosophy of Zendesk was the realization that roughly 35 percent of consumers are silent users, who seek their own answers, instead of contacting customer support. On one hand, this is a great advantage for a company, as it reduces their cost of support. But it is fraught with risk of customer satisfaction issues, as a customer may decide to move to a competitor without the incumbent ever knowing they needed help.

Svane, like the executives at Boeing in the World War II era, sees customer service as a means to build relationships with customers, as opposed to a hindrance. He believes this perspective is starting to catch on more broadly. “What has happened over the last five or six years is that the notion of customer service has changed from just being this call center to something where you can create real, meaningful long-term relationships with your customers and think about it as a revenue center.”


It would be very easy for Dan to receive a loan and for the bank to underwrite that loan if the right data was available to make the decision. With the right data, the bank would know who he is, as well as his entire history with the bank, recent significant life changes, credit behavior, and many other factors. This data would be pushed to the bank representative as Dan walked in the door. When the representative asked, “How can I help you today?” and learned that Dan was in the market for a new home, the representative would simply say, “Let me show you what options are available to you.” Dan could make a spot decision or choose to think about it, but either way, it would be as simple as purchasing groceries. That is the power of data, transforming customer service.

This post is adapted from the book, Big Data Revolution: What farmers, doctors, and insurance agents teach us about discovering big data patterns, Wiley, 2015.