Customer experience (CX) goes beyond measuring the relationship between customers and companies; it is also about quantifying the hundreds of regular interactions and residual memories that influence future behavior. Tools like journey mapping and touchpoint management help employees decode many in-store and in-person experiences. But it’s important for your team to understand the context in which data is being used to make company-wide decisions.
Data and analytics professionals seem to be at the center of the next big race for talent. In 2015, there was a surplus of people with data science skills. Now there’s a significant shortage. By 2020, IBM expects broader demand for data and analytics talent to reach 2.7 million positions in the U.S. alone.
The competition for talent will be especially intense for companies for which advanced analytics forms a core part of the value proposition — think e-commerce giants, hedge funds, and complex systems engineering firms. For them, a dedicated, in-house team of data specialists can be a necessity.
But the rest of us? Not so much. Consider the findings of a Rexer Analytics survey in which more than a third of data analytics professionals say their company never, or only sometimes, puts their analyses to use. This calls into question the practicality of funneling analyses through
Across our society and in all industries, leaders and their organizations are racing to unlock the value of data, tech-enable business processes, and create better, more digitally-enhanced experiences for customers, clients, and employees. They are working to disrupt their own businesses before somebody else does. This cannot be done without substantial investment in talent. With about 500,000 unfilled tech jobs in the U.S., a number that’s widely anticipated to double by 2020, executives know they can’t hire their way out of the need for upskilled employees. And workers are keenly focused on organizations that will invest in their development and help secure their future in a digital, data-driven economy.
Executives find themselves confronted with decisions about whether to acquire expertise from outside the company, through recruiting, partnerships, or acquisitions. But they often overlook the idea of upskilling their current workforce. Upskilling can be
Leaders today increasingly turn to big data and advanced analytics in hopes of solving their most pressing problems, whether it’s a drop-off in repeat customers, a shift in consumption patterns, or an attempt to reach new markets. The prevailing thought is that more data is better, especially given advancements in tools and technologies such as artificial intelligence and predictive analytics.
But when it comes to uncovering the motivations and rationale behind individual behaviors within a social system, data can only do so much. It can guide the discovery of a problem, but it won’t determine the solution. In other words, data analytics can tell you what is happening, but it will rarely tell you why. To effectively bring together the what and the why — a problem and its cause, in order to find a probable solution — leaders need to combine the advanced capabilities of big
In 2018, every organization has a data strategy. But what makes a great one?
We all know what failure looks like. Resources are invested, teams are formed, time goes by — but nothing comes of it. No one can necessarily say why; it’s always Someone Else’s Fault.
It’s harder to tell the difference between a modest success and excellence. Indeed, in data science the two can look very similar for perhaps a year. After several years, though, an excellent strategy will yield results that are orders of magnitude more valuable.
Both mediocre and excellent strategies begin with a series of experiments and investments leading to data projects. After a few years, some of these projects work out and are on their way to production.
In the mediocre strategy, one or two of these projects may even have a clear ROI for the business. Typically, these projects will be some kind
There are plenty of great ideas and techniques in the data space: from analytics to machine learning to data-driven decision making to improving data quality. Some of these ideas have been around for a long time and are fully vetted, proving themselves again and again. Others have enjoyed wide socialization in the business, popular, and technical press. Indeed, The Economist proclaimed that data are now “the world’s most valuable asset.”
With all these success stories and such a heady reputation, one might expect to see companies trumpeting sustained revenue growth, permanent reductions in cost structures, dramatic improvements in customer satisfaction, and other benefits. Except for very few, this hasn’t happened. Paradoxically, “data” appear everywhere but on the balance sheet and income statement. Indeed, the cold reality is that for most, progress is agonizingly slow.
To create an analytical culture in your organization, you need to nurture the right mindset among your employees. And that starts with creating a culture of analytics in your HR department. How can senior leaders help HR develop a culture in which people think analytically? First, you need to understand the different levels of comfort with analytics in HR, and then you need to decide your approach to hiring and building expertise at each of the different levels.
Understanding your current levels of HR analytics expertise
Our research for the book The Power of People showed that HR professionals can be broadly categorized into one of three groups with respect to their current analytical capability:
Analytically Savvy — These are HR professionals who are formally trained in analytics techniques and are adept at working with data and interpreting analyses.
Analytically Willing — These people are open-minded about analytics
It is no doubt a sign of progress that a significant proportion of organizations and managers today appear to feel guilty when they admit that they are making big management decisions in an intuitive rather than evidence-based way. Indeed, being data-driven has joined the ranks of “innovative,” “diverse,” and “socially responsible” as one of the most laudable features of organizational culture, at least if we go by company websites.
Although feeling the pressure to demonstrate that objective facts — instead of subjective preferences — underlie managers’ key choices is no doubt a major step toward actually becoming a data-driven organization, it’s an ambitious goal for any company. It requires a big cultural transformation, one that must transcend the wishes of senior leaders to create real changes in how people think, feel, and act at all levels of the organization. And, as with any cultural transformation,
If you were entering the job market in the early ’90s, most job descriptions included “Macintosh experience” or “excellent PC skills” in their preferred qualifications. This quickly became a requirement for even the most non-technical jobs, forcing people across every industry and age group to adapt to the changing times or risk getting left behind.
Today, the bar for computer proficiency is set much higher. There’s an ever-increasing demand for people who can leverage software to analyze, understand, and make day-to-day business decisions based on data. Data science is now a fast-growing discipline, giving people with any kind of data expertise a serious competitive edge.
Corporate leaders are becoming convinced of the impact that effective data collection and analysis can have on the bottom line, from tracking daily reports against key performance indicators to make informed decisions about where to spend marketing dollars, to monitoring and evaluating customer
We’ve been teaching and testing Microsoft Excel for a decade, and a survey we ran of several hundred office staff suggests that we spend more than 10% of our working lives spreadsheeting. For those working in research and development or finance, it’s more like 30%, or roughly 2.5 hours a day.
Imagine, then, if this substantial proportion of the global workforce were a little better at using the application. Time would be saved, and
Apple recently announced a new feature for the Apple Watch: the latest version will be able to measure heart rhythms and notify wearers about abnormal, and potentially harmful, patterns. Doctors, however, are skeptical. Their biggest concern is that the feature hasn’t been rigorously tested and could provide unreliable data, creating a false sense of risk among users and leading patients to ask for unnecessary tests.
While these concerns are valid, doctors shouldn’t be too quick to dismiss the new feature, particularly as it appears amidst growing consumer enthusiasm for wearable devices that measure health behaviors. The Apple Watch has the potential to provide valuable data that benefits the entire health care community.
Monitoring the heart
The ECG app, debuting later this year in the new Apple Watch Series 4, allows users to touch the digital crown to generate an ECG in 30 seconds. ECG stands for electrocardiogram, a recording of
With today’s high demand for data scientists and the high salaries that they command, it’s often not practical for companies to keep them on staff. Instead, many organizations work to ramp up their existing staff’s analytics skills, including predictive analytics. But organizations need to proceed with caution. Predictive analytics is especially easy to get wrong. Here are the first three “don’ts” your team needs to learn, and their corresponding remedies.
1) Don’t Fall for Buzzwords — Clarify Your Objective
You know the Joe Jackson song, “You Can’t Get What You Want (Till You Know What You Want)”? Turn it on and let it be your mantra. As fashionable as it is, “data science” is not a business objective or a learning objective in and of itself. This buzzword means nothing more specific than “some clever use of data.” It doesn’t necessarily refer to any particular
I’ve long been both paranoid and optimistic about the promise and potential of artificial intelligence to disrupt — well, almost everything. Last year, I was struck by how fast machine learning was developing and I was concerned that both Nokia and I had been a little slow on the uptake. What could I do to educate myself and help the company along?
As chairman of Nokia, I was fortunate to be able to worm my way onto the calendars of several of the world’s top AI researchers. But I only understood bits and pieces of what they told me, and I became frustrated when some of my discussion partners seemed more intent on showing off their own advanced understanding of the topic than on truly helping me get a handle on how it really works.
I spent some time complaining. Then I realized that
The past year has served as a wake-up call for many Facebook users. Between the Cambridge Analytica scandal, Mark Zuckerberg’s congressional testimony, and the advent of Europe’s General Data Protection Regulation (GDPR), we have fresh insight into how much Facebook knows about us — knowledge that has inspired many people to rethink what they share on Facebook, how they manage their Facebook settings, or even whether they want to use social media at all.
While Facebook’s algorithm uses our data to show us content and ads that it thinks are more likely to be of interest to us, it can also distort our view of the world by limiting what we see to the people and perspectives we find most appealing or otherwise engaging. That algorithm is also the reason that some Facebook threads unfold as civil, respectful (but perhaps insufficiently representative) conversations among like-minded souls, while others turn into all-out
Ten years on from the financial crisis, stock markets are regularly reaching new highs and volatility levels new lows. The financial industry has enthusiastically and profitably embraced big data and computational algorithms, emboldened by the many triumphs of machine learning. However, it is imperative that we question the confidence placed in the new generation of quantitative models, innovations that could, as William Dudley warned, “lead to excess and put the [financial] system at risk.”
Eighty years ago, John Maynard Keynes introduced the concept of irreducible uncertainty, distinguishing between events one can reasonably calculate probabilities for, such as the spin of a roulette wheel, and those which remain inherently unknown, such as war in ten years’ time. Today, we face the risk that investors, traders, and regulators are failing to understand the extent to which technological progress is — or more precisely is not — reducing financial uncertainty.
Ming Zeng, the chief strategy officer at Alibaba, talks about how the China-based e-commerce company was able to create the biggest online shopping site in the world. He attributes Alibaba’s rise as a retail and distribution juggernaut to its use of automation, algorithms, and networks to better serve customers. And he says that in the future, successful digital companies will use technologies such as artificial intelligence, the mobile internet, and cloud computing to redefine how value is created. Zeng is the author of Smart Business: What Alibaba’s Success Reveals About the Future of Strategy.
Many governments are currently rethinking their policies regarding cross-border data flows. Although cross-border data flows grew 45x between 2005 and 2014, according to a McKinsey analysis, events since 2014 have swung the pendulum away from unconstrained data globalization.
Some policy makers are concerned about individual privacy rights, consumer rights regarding the ownership of data, domestic law enforcement, and cybersecurity. Others are driven by the desire to control or censor online media. Still others hope to create market barriers for global companies — a form of digital protectionism.
Our view is that too much regulation will create, in effect, data islands, which will in turn prevent citizens and consumers trapped on those islands from enjoying the many benefits of tighter links to the global digital economy. These include access to digital goods and services, being part of global supply chains, accelerating and partaking in the
In early June, at the invitation of the European Commission, I traveled to Brussels to tour some fascinating AI- and blockchain-based projects that the Commission is funding. Across industrial sectors, from healthcare to energy, from construction to retail, engineers are creating new technologies with potentially disruptive implications for the current architectural order of the global economy. One of the technologies, an “AI doctor,” shows great promise for the future of healthcare in Africa.
The solution is called CareAi: an AI-powered computing system, anchored on blockchain, that can diagnose infectious diseases such as malaria, typhoid fever, and tuberculosis within seconds. The platform is engineered to serve an invisible demographic: migrants, ethnic minorities, and those unregistered within traditional healthcare systems. By bringing AI and blockchain together, CareAi uses a distributed healthcare architecture to deliver health services to patients anonymously. This makes it possible for these
In January of 2018, Annette Zimmermann, vice president of research at Gartner, proclaimed: “By 2022, your personal device will know more about your emotional state than your own family.” Just two months later, a landmark study from Ohio State University claimed that its algorithm was now better at detecting emotions than people are.
We’ve all experienced some version of this problem: Ask “How many customers do we have?” and the marketing team provides one answer, sales a second, and accounting a third. Each department trusts its own system, but when the task at hand requires that data be shared across silos, the company’s various systems simply do not talk to one another.
The problem arises because different systems employ different definitions of key terms. Thus, “customer” can mean a potential buyer to the marketing department, the person who signed the purchase order to sales, and the legal entity that gets billed to accounting. People then misunderstand the data and make mistakes. These issues grow more important as companies try to pull more and more disparate data together — to develop predictive models using machine learning, for example.
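To make the mismatch concrete, here is a minimal sketch of how three departmental definitions of “customer” yield three different counts from the same records. The records and the rules are invented purely for illustration; real systems would, of course, hold this data in separate databases with their own schemas.

```python
# Hypothetical shared records; each flag reflects one department's criterion.
records = [
    {"name": "Acme Corp", "requested_quote": True,  "signed_po": False, "invoiced": False},
    {"name": "Globex",    "requested_quote": True,  "signed_po": True,  "invoiced": True},
    {"name": "Initech",   "requested_quote": False, "signed_po": True,  "invoiced": True},
    {"name": "Umbrella",  "requested_quote": True,  "signed_po": False, "invoiced": True},
]

# Each department answers "how many customers do we have?" with its own rule.
marketing  = sum(r["requested_quote"] for r in records)  # potential buyers
sales      = sum(r["signed_po"] for r in records)        # signed a purchase order
accounting = sum(r["invoiced"] for r in records)         # legal entities billed

print(marketing, sales, accounting)  # → 3 2 3: three answers from one dataset
```

The point is not that any department is wrong; each rule is internally consistent. The trouble starts when the three numbers are compared, or fed together into a model, without a shared definition.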
Specialized vocabularies develop in the business world every day to