In recent years, marketers have lived through the Era of Big Data and the Era of Personalization, and now we are living through the “Era of Consent.” With the General Data Protection Regulation (GDPR) going into effect on May 25, 2018, businesses will be required to protect the personal data and privacy of EU citizens. For marketers, this means updating your privacy policies, but more importantly, it means finding new ways to connect with customers and gather consent to use their data in order to continue your “marketing relationship” with them.
Marketers across the European Union (EU) have been preparing for this new regulation for months. Yet the regulation impacts all companies globally, including those in the United States, that collect and manage data on citizens in the EU. Many global marketers are still struggling to understand what steps they need to take to
The term “frictionless commerce” is widely used to describe how digital technologies are blending product purchases seamlessly into consumers’ daily lives. In the ultimate manifestation of frictionless commerce, purchases will be automatically initiated on behalf of consumers (with their advance consent) using real-time, integrated data from known preferences, past behaviors, sensors, and other sources. Envision, for example, a “smart fridge” automatically ordering food items it senses are running low. That is not common yet, but ever since consumers were offered the option to shop online from home, rather than having to go to a store, technology has been rapidly removing friction from commerce.
As frictionless commerce accelerates, so will a momentous shift in the node of commerce. In the era of department stores and supermarkets, consumers selected brands from store aisles and shelves. Over the past several decades, the in-store experience has been increasingly displaced by online shopping,
Many mature industries are experiencing significant technological disruption. The automotive industry is being disrupted by electric vehicles and self-driving cars, just as home appliances are being disrupted by the Internet of Things and smart appliances, home entertainment by on-demand content providers, and apparel by online personal stylists such as Stitch Fix and Trunk Club.
Leaders in every industry are no doubt keeping a vigilant eye on such developments, yet one very important aspect of this disruption has been largely overlooked: technology fundamentally changes what makes your brand premium.
The traditional drivers of brand premium are being joined (and to varying degrees supplanted) by newer, tech-enabled variables: software, interactive products, digital interactions, immersive experiences, and predictive services, to name a few.
Here’s how technology is changing the game in the automotive industry:
Product: hardware vs. software. While hardware currently accounts for 90% of the perceived value of a car,
This month will see the enforcement of a sweeping new set of regulations that could change the face of digital marketing: the European Union’s General Data Protection Regulation, or GDPR. To protect consumers’ privacy and give them greater control over how their data is collected and used, GDPR requires marketers to secure explicit permission for data-use activities within the EU. With new and substantial constraints on what had been largely unregulated data-collection practices, marketers will have to find ways to target digital ads while depending less (or not at all) on hoovering up vast quantities of behavioral data.
At the Dorchester Collection, a group of ultra-luxury hotels, we use big data and analytics to help us improve our guest offerings and marketing. Our tool, Metis, analyzes data from online reviews and social media to uncover problems and opportunities. But, as the Dorchester Collection’s director of global guest experience and innovation, I’ve discovered that the data can often only tell you where there’s a problem, not why it exists or how to fix it. That requires human intervention.
For instance, last year Metis looked at customer sentiment about Parisian luxury hotels. Metis discovered that guests had little loyalty to ours — Le Meurice and Hotel Plaza Athénée — or to our competitors’ hotels. According to Metis’ analysis, guests view Paris’s 5-star hotels as interchangeable. They visit different ones simply to try something new.
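To make the kind of analysis Metis performs concrete, here is a minimal sketch of lexicon-based sentiment aggregation over hotel reviews. Everything in it — the word lists, the scoring rule, the sample reviews — is an invented illustration, not the proprietary Metis system:

```python
# Hypothetical sketch of review-sentiment aggregation in the spirit of a
# tool like Metis. The lexicons and sample data are invented for illustration.
from collections import defaultdict

POSITIVE = {"excellent", "charming", "loyal", "wonderful", "attentive"}
NEGATIVE = {"interchangeable", "noisy", "indifferent", "disappointing"}

def score_review(text: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def hotel_sentiment(reviews):
    """Average sentiment per hotel from (hotel, review_text) pairs."""
    scores = defaultdict(list)
    for hotel, text in reviews:
        scores[hotel].append(score_review(text))
    return {hotel: sum(s) / len(s) for hotel, s in scores.items()}

reviews = [
    ("Le Meurice", "charming staff and excellent service"),
    ("Le Meurice", "honestly the hotels feel interchangeable"),
    ("Hotel Plaza Athenee", "attentive but ultimately interchangeable"),
]
summary = hotel_sentiment(reviews)
```

A real system would use a trained sentiment model rather than word lists, but the shape is the same: score each review, aggregate by property, and compare — which is how a finding like “guests view 5-star hotels as interchangeable” surfaces from raw text.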
But once Metis noted this lack of customer loyalty, it was up to us to figure out
By the end of 2017, Yelp had amassed more than 140 million reviews of local businesses. While the company’s mission focuses on helping people find local businesses more easily, this wealth of data has the potential to serve other purposes. For instance, Yelp data might help restaurants understand which markets they should consider entering, or whether to add a bar. It can help real estate investors understand where gentrification might occur. And it might help private equity firms with an interest in coffee decide whether to invest in Philz or Blue Bottle.
The potential value of the large data sets being amassed by private companies raises new opportunities and challenges for managers making strategic data decisions. While there are plenty of well-publicized examples of data repurposing gone wrong, we think it would be a shame for companies to decide the only option is to hoard their data. Before you
Here’s a truth: many in the marketing industry today don’t really understand brands. They may think of “brand” and “customer experience” as separate things, treating brand as a mere layer of communications. That is the stuff of blowhard manifestos — too high-minded to sell anything, and too lofty to be useful. Today, when there’s more of everything — more channels, more choice, more speed, more confusion — more noise and less signal, it’s fair to ask: What is a brand?
In order to answer this question, we have to think about what has changed over the last decade. The emergence of the iPhone and smart technology completely altered the way consumers interact with media and brands. “Digital” and “social” have become inseparable from everyday life. People are consuming media and content as well as curating and creating it. Consumers quickly became accustomed to the opportunity
Companies are collecting more data than ever before, and are making significant business decisions based on it. Of the 4 Vs of Big Data (Volume, Velocity, Variety, and Veracity), we have now seen ample evidence of the impact and importance of the first three. A higher “Volume” of data has led to more efficient decision-making in numerous instances, such as in programmatic marketing and in banking. Research has shown how leveraging high “Velocity” data — such as data from mobile devices — has unearthed knowledge that has helped firms better understand their customers. The significant potential of high “Variety” data — data that is unstructured in the form of text, images, videos, and so on — to make better predictions has been documented in numerous academic studies. But what about issues related to the accuracy, reliability, and transparency of the data itself, which comprise the fourth V, “Veracity”? In the
With so much information and technology at their fingertips, today’s consumers expect to get things done quickly and have their questions answered in an instant. The same could be said for modern marketers — expectations for their technology stack are on the rise too.
In a recent speech, Alex Azar, the U.S. secretary of health and human services, said, “There is no more powerful force than an informed consumer.” What about an informed provider? If health systems are truly going to improve the value of the care they deliver, they need to enlist doctors in the effort. According to a national survey conducted by University of Utah Health, 89% of physicians believe the overall cost of health care in this country is too high. Now we need to give doctors a chance at engaging in the conversation by developing tools to make cost transparent to them.
For the past five years, University of Utah Health has been working on a tool that does just that. Its Value Driven Outcomes (VDO) initiative provides physicians with cost data to assess health outcomes per dollar spent. VDO is a modular, extensible framework
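As an illustration only (the actual VDO framework is far richer than this), the core metric — health outcomes achieved per dollar spent, grouped by physician — can be sketched as a simple aggregation over cases. The field names below are hypothetical, not drawn from the VDO system:

```python
# Illustrative sketch, NOT the actual VDO implementation: compute
# "outcomes per dollar" per physician from a list of case records.
from collections import defaultdict

def value_per_dollar(cases):
    """cases: dicts with hypothetical keys physician, outcome_score, total_cost."""
    agg = defaultdict(lambda: [0.0, 0.0])  # physician -> [outcomes, cost]
    for case in cases:
        agg[case["physician"]][0] += case["outcome_score"]
        agg[case["physician"]][1] += case["total_cost"]
    # Higher is better: more outcome per dollar spent.
    return {p: round(out / cost, 4) for p, (out, cost) in agg.items() if cost}

cases = [
    {"physician": "A", "outcome_score": 8.0, "total_cost": 2000.0},
    {"physician": "A", "outcome_score": 6.0, "total_cost": 2000.0},
    {"physician": "B", "outcome_score": 7.0, "total_cost": 1000.0},
]
report = value_per_dollar(cases)
```

The point of surfacing a number like this to physicians is exactly what the article describes: making cost visible alongside outcomes so doctors can engage with the value conversation.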
Communicators and marketers can now adopt a personalized approach to their work, ideally one based on behavioral science. But execution lags behind the science, and the claims of some marketers about what personality marketing can do far exceed both. Moreover, public controversies like the Facebook and Cambridge Analytica story threaten personality marketing’s potential before it has really matured.
It’s important not to judge a field by its worst actors. Marketers, communicators, and the public alike deserve a better understanding of personality marketing — what it is, how it works, and why it matters.
The personality targeting controversy
Beyond the allegations of misuse of personal information gleaned from unwitting participants in social media, the Cambridge Analytica controversy raised an aspect of marketing that few people knew much about: the targeting of people based not only on their past behaviors and explicitly stated preferences, but
The media has rightfully focused on Facebook and its outsized role in what some are calling the surveillance economy. But focusing just on Facebook is a mistake, for data accumulation and its subsequent abuse can happen anywhere, anytime. Various data streams are being reassembled for hyper-targeting. And one of these could be Twitter, which sells its data to others.
Ever since Adam Smith published The Wealth of Nations in 1776, observers have bemoaned boards of directors as being ineffective as both monitors and advisors of management. Because a CEO often effectively controls the director selection process, he will tend to choose directors who are unlikely to oppose him, and who are unlikely to provide the diverse perspectives necessary to maximize firm value. Institutional investors often are critical of CEOs’ influence over boards and have made efforts to help companies improve their governance. Nonetheless, boards remain highly imperfect.
Could technology help? Advances in machine learning have led to innovations ranging from facial recognition software to self-driving cars. These techniques are rapidly changing many industries — could they also improve corporate governance?
To explore that question, we conducted a study of how machine learning might be used to select board directors, and how the selected directors might differ from those
How many times have you heard an executive apply familiar aphorisms to business challenges, and you just know it’s a means to justify bad behavior? They might say “business isn’t personal” as an excuse for sub-par treatment of others. Executives demand that employees “do more with less,” but then don’t allow people to focus on less. But the worst of them is one so many leaders seem to cling to: “Make a choice between fast, cheap, or good.”
Old-school rhetoric like this produces the wrong answers and leads to more problems. Worse, for digital companies, it’s a practice that will keep the business and its people from truly transforming and competing.
The focus of an organization’s leaders can no longer center around compromising two out of three values. Instead, companies should focus on optimizing all of them. You can achieve all three when you’re working in the right ways
Poor data quality is enemy number one to the widespread, profitable use of machine learning. While the caustic observation, “garbage-in, garbage-out” has plagued analytics and decision-making for generations, it carries a special warning for machine learning. The quality demands of machine learning are steep, and bad data can rear its ugly head twice — first in the historical data used to train the predictive model and second in the new data used by that model to make future decisions.
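A minimal sketch of those two checkpoints — validating historical data before training, and screening new inputs at prediction time — assuming tabular records held as dicts. The field names and ranges are illustrative, not from the article:

```python
# Hypothetical sketch of the two places bad data bites a predictive model:
# (1) training data that is incomplete or duplicated, and
# (2) prediction-time inputs outside the range the model was trained on.

def check_training_rows(rows, required_fields):
    """Flag rows that are incomplete or duplicated before training."""
    seen, problems = set(), []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            problems.append((i, f"missing {missing}"))
        key = tuple(sorted(row.items()))
        if key in seen:
            problems.append((i, "duplicate row"))
        seen.add(key)
    return problems

def check_inference_input(row, train_ranges):
    """At prediction time, flag fields outside the range seen in training."""
    return [
        f for f, (lo, hi) in train_ranges.items()
        if not (lo <= row.get(f, lo) <= hi)  # missing field passes by default
    ]

rows = [
    {"age": 34, "income": 50000},
    {"age": None, "income": 60000},   # incomplete: should be flagged
    {"age": 34, "income": 50000},     # duplicate: should be flagged
]
train_problems = check_training_rows(rows, ["age", "income"])
out_of_range = check_inference_input({"age": 95}, {"age": (18, 80)})
```

Production pipelines would add label audits, bias checks across the input range, and schema validation, but the two-stage structure — train-time and serve-time — is the essential pattern.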
To properly train a predictive model, historical data must meet exceptionally broad and high standards of quality. First, the data must be right: It must be correct, properly labeled, de-duped, and so forth. But you must also have the right data — lots of unbiased data, over the entire range of inputs for which one aims to develop the predictive model. Most data quality work focuses on one
Over the past year, the healthcare industry found itself under constant attack. Cybercriminals targeted vulnerable clinical networks and poor controls to gain privileged access to medical devices and databases on an almost daily basis. Consider that in just the first two months of 2018, 24 health care provider organizations reported data breaches affecting over 1,000 patients each, a 60% increase over the same time period last year. However, with only 53% of healthcare and public-sector security decision makers reporting a breach in the past year, it’s likely there are many more breaches going unreported.
The threats are only getting more serious. The number of ransomware attacks has surged in the healthcare industry and can cripple a hospital’s network and hinder services. Complicating matters, most hospital networks are “flat” rather than segmented, so infections can more easily propagate from IT to clinical networks. Healthcare data is
“It’s no good fighting an election campaign on the facts,” Cambridge Analytica’s managing director told an undercover reporter, “because actually it’s all about emotion.” To target U.S. voters and appeal to their hopes, neuroses, and fears, the political consulting firm needed to train its algorithm to predict and map personality traits. That required lots of personal data. So, to build these psychographic profiles, Cambridge Analytica enlisted a Cambridge University professor, whose app collected data on about 50 million Facebook users and their friends. Facebook, at that time, allowed app developers to collect this personal data. Facebook argued that Cambridge Analytica and the professor violated its data policies. But this was not the first time its policies were violated. Nor is it likely to be the last.
This scandal came on the heels of Russia’s using Facebook, Google, and Twitter “to sow discord in the U.S. political system,
Machine learning can drive tangible business value for a wide range of industries — but only if it is actually put to use. Despite the many machine learning discoveries being made by academics, new research papers showing what is possible, and an increasing amount of data available, companies are struggling to deploy machine learning to solve real business problems. In short, the gap for most companies isn’t that machine learning doesn’t work, but that they struggle to actually use it.
How can companies close this execution gap? In a recent project we illustrated the principles of how to do it. We used machine learning to augment the power of seasoned professionals—in this case, project managers—by allowing them to make data-driven business decisions well in advance. And in doing so, we demonstrated that getting value from machine learning is less about cutting-edge models, and more about making
Rise Science came to IDEO with a challenge. The young startup had built a robust data platform for college and professional athletes to track their sleep and adjust their behavior so that they played at peak performance. But for the players, the experience was challenging. Rise expected athletes to look at data-driven charts and graphs to determine what decisions to make next, but players struggled to find those insights. Rise was convinced they just needed easier-to-read charts and graphs.
As IDEO designers and Rise’s data scientists spent time with players and coaches, they discovered that Rise didn’t have a data visualization problem; it had a user experience problem. Charts and graphs were far less important than knowing when to go to bed each night and when to wake up the next morning. Within a few weeks, the charts and graphs moved into the background of the app and