Surveillance marketing: Too much personalization can hurt your brand
Personalization is often touted as a panacea in the world of marketing: an omnipotent force that recognizes our desires and needs and makes the world of advertising and experiences more relevant. All in the noble cause of selling more stuff.
A study published in the Journal of Applied Psychology found that personalized ads attract more attention and are remembered for longer. Salience and mental availability are fundamental to advertising success. So all good?
Well, there is also research suggesting that as consumers learn more about how advertising personalization works, they like it less. A recent YouGov study found that 45 percent of UK consumers are against their data being used for personalization of information, services, and advertising, and 54 percent find personalized advertising creepy.
This fundamental dilemma between privacy and personalization can be explained using a simple self-help tool.
In 1955, two psychologists, Joseph (Joe) Luft and Harrington (Harry) Ingham, created a framework for understanding people and their relationships with others: the Johari Window, a blend of their first names. This two-by-two grid explores the crossover between what we know about ourselves and what others know about us.
The Arena
The first area is the Arena, which comprises information that is publicly known about us (our height, our gender). This is where most good marketing occurs, from broad audience definitions to specific filters. Alcohol and tobacco companies cannot target consumers under a certain age, which varies by country. Sunscreen is sold to people at airports. Personalization using publicly available information is generally uncontentious and is often demanded of brands. I don’t expect my bank to cross-sell me products I already hold, and if I have never bought meat from Tesco, I don’t expect Tesco to target me with meat promotions.
The challenge with this type of personalization is that it borders on good common sense. People don’t buy meat because they are vegetarian; you don’t need a machine learning model to work that out. Yet the results produced by this type of personalization are often benchmarked against a random sample – i.e. something dumber than common sense – so the performance uplift is overstated, as the sketch below illustrates.
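To make that concrete, here is a minimal, hypothetical simulation in Python – all the numbers are invented – in which a “personalized” meat promotion that does nothing smarter than exclude vegetarians still posts a double-digit uplift over a random audience:

```python
# Minimal sketch with invented numbers: benchmarking targeting against a
# *random* audience manufactures uplift that common sense gets for free.
import random

random.seed(42)

# Simulate 100,000 customers; 10% are vegetarian (invented rate).
customers = []
for _ in range(100_000):
    vegetarian = random.random() < 0.10
    # Invented response model: vegetarians never respond to a meat
    # promotion; everyone else responds 2% of the time.
    responds = (not vegetarian) and random.random() < 0.02
    customers.append({"vegetarian": vegetarian, "responds": responds})

def response_rate(audience):
    return sum(c["responds"] for c in audience) / len(audience)

# Random baseline: blast the promotion at everyone.
r_random = response_rate(customers)

# "Personalization" that is really just common sense: skip the vegetarians.
targeted = [c for c in customers if not c["vegetarian"]]
r_targeted = response_rate(targeted)

print(f"random sample:   {r_random:.4f}")
print(f"common sense:    {r_targeted:.4f}")
print(f"reported uplift: {r_targeted / r_random - 1:+.1%}")
```

The headline uplift here (around +11 percent) is created entirely by the weakness of the baseline, not by any intelligence in the targeting.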
The Facade
The Facade comprises information we know about ourselves but others do not. Many people compartmentalize their lives. Some people would hate for their work colleagues to learn of their weekend pastimes. Others simply want to keep their private lives, well, private. Almost everyone breaks into a cold sweat when the term “browser history” is raised in polite conversation. There are serious implications and consequences to personalization fueled by data from behind the Facade.
This is where personalization becomes creepy. Advertising may be perfectly targeted at an individual, but the target may be appalled that an advertiser has connected their supposedly secret gambling or porn habit to a personalized offer displayed in front of family or colleagues.
A famous example of this is former UK Conservative MP and current Downing Street chief of staff Gavin Barwell, who, on seeing an advertisement inviting him to “date Arab girls” on a Labour Party online press release, tweeted a complaint about it and thereby outed his own browser history. (I am sure he probably shares his browser with a team of interns. That’s what I would say.)
Brands that leverage data from the Facade risk the wrath of the consumer and serious reputational harm. Kashmir Hill has reported that Facebook’s “people you may know” algorithm connected patients of the same therapist to each other. Beyond breaching patient privacy, there is a danger of introducing people at risk to one another.
The Blind Spot
Our Blind Spot is information our friends and family might know about us but we are oblivious to. For example, if I were served an ad for Listerine, a popular mouthwash, and I didn’t know I had bad breath (but my wife did), the ad would be wasted on me: the message is irrelevant to someone who doesn’t know they have the need.
Worse, knowing a consumer’s blind spot can lead to exploitation. This ranges from knowing which customers are more price-sensitive at one end of the spectrum to exploiting customers who exhibit addictive tendencies at the other. Retailers target offers and deals at the customers they think are most likely to respond – marketing 101. But how far should this go? Online gambling companies give free money every month to the customers who lose the most to them. EA Sports generates $800 million from in-game purchases across its FIFA and Madden titles. Personalization driven by data in our Blind Spot can have an insidious effect.
The Unknown
The Unknown is the final and least charted space: information that is not known to anyone. This is potentially a hugely profitable space for brands and consumers alike. Predicting the future is the killer application of data analytics, whereas most analysis today is merely the regurgitation of historical tendencies.
Some people are keen to understand what their genetics predict about their future health and associated risks in order to live a longer and healthier life. Others are horrified and seek to avoid such predictions.
The problem with personalization
Evidence of personalization in action is abundant. Scratch the surface, however, and a lot of that evidence doesn’t hold up to scrutiny. First, the majority of success stories are produced by the technology vendors who sell personalization software. Second, personalization requires a brand to produce more variants of communication and content, which increases costs – costs that are often not factored into comparative ROI benchmarks. Third, while personalization often delivers significantly higher ROI, this almost always comes at the cost of scale. ROI is a measure of efficiency under constraints: anyone claiming a 10x improvement is not talking about profit or market share, but about a click-through rate with a baseline significantly under 1 percent.
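A quick back-of-envelope sketch in Python makes the trade-off concrete. All the figures are invented, but the shape of the arithmetic is the point: a tenfold CTR improvement on a narrowly targeted audience can still deliver far fewer clicks than the broad campaign it replaced.

```python
# Back-of-envelope sketch (all figures invented): a 10x CTR uplift can
# coexist with a collapse in absolute volume once scale is counted.

broad = {"impressions": 10_000_000, "ctr": 0.002}  # mass campaign, 0.2% CTR
niche = {"impressions": 200_000, "ctr": 0.020}     # personalized, 10x the CTR

for name, campaign in [("broad", broad), ("personalized", niche)]:
    clicks = campaign["impressions"] * campaign["ctr"]
    print(f"{name:>12}: {clicks:>8,.0f} clicks "
          f"({campaign['ctr']:.1%} CTR on {campaign['impressions']:,} impressions)")

# The efficiency metric improves 10x while total clicks fall 5x.
ratio = (niche["impressions"] * niche["ctr"]) / (broad["impressions"] * broad["ctr"])
print(f"click volume, personalized vs broad: {ratio:.0%}")
```

Whether that trade is worth making depends on the economics of each click, which is exactly the detail the 10x headline omits.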
Aleksandr Kogan, the academic at the heart of the Facebook/Cambridge Analytica affair, claims the accuracy of the personality profiling used to target advertising was wildly exaggerated, estimating that he was six times more likely to get everything wrong about a person than everything right.
Social media platforms provide free services, costing billions to run, and monetize their efforts through targeted advertising. Due to the scale and complexity of these advertising platforms, targeting is managed algorithmically, and those pesky algorithms have not thought through all the ethical quandaries they might be faced with.
A University of Sheffield research paper by Ysabel Gerrard explored the role of personalization algorithms in recommending additional pro-eating-disorder content to users who seek out such content, alongside the associated topics of suicide and self-harm. The algorithm is accurate; the application is unethical.
Netflix was recently criticized for the way it personalizes the artwork of films and TV series. Within the Netflix experience, cover artwork is the biggest influence on customer viewing habits. Without ethnicity ever being fed into the algorithm, the personalization system matched viewers with films containing actors of similar ethnic backgrounds and redesigned the artwork to feature those actors, even when they were minor characters. An algorithm doesn’t have to be trained on ethnic data to produce results that are highly differentiated for different ethnic groups. Machine learning at scale is full of unintended consequences, because algorithms have no ethics.
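A toy simulation in Python shows how little it takes. The data is entirely synthetic and the setup hypothetical, but the mechanism is the same: give a model a behavioral signal that merely correlates with a hidden group, and its output splits sharply along group lines without the group ever appearing as a feature.

```python
# Synthetic sketch: a protected attribute the model never sees still
# drives its output, because observable behavior acts as a proxy for it.
import random
from collections import defaultdict

random.seed(0)

# Each viewer belongs to a hidden group, unknown to the "algorithm".
viewers = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    # Viewing history correlates with group membership (invented rates);
    # that correlation is the proxy.
    watches_show_x = random.random() < (0.8 if group == "A" else 0.2)
    viewers.append({"group": group, "watches_show_x": watches_show_x})

def pick_artwork(viewer):
    # Artwork picker driven only by viewing history, never by group:
    # fans of show X get artwork featuring its lead actor.
    return "artwork_1" if viewer["watches_show_x"] else "artwork_2"

shown = defaultdict(lambda: defaultdict(int))
for v in viewers:
    shown[v["group"]][pick_artwork(v)] += 1

for group, counts in sorted(shown.items()):
    total = sum(counts.values())
    print(group, {art: f"{n / total:.0%}" for art, n in sorted(counts.items())})
# Group A sees artwork_1 roughly 80% of the time, group B roughly 20% –
# highly differentiated output from a model given no group data at all.
```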
The road to hell is paved with good intentions. Personalization is often positioned as a panacea in marketing; it is only ever seen as a good thing. Yet many ethical boundaries are being pushed in the rampant desire to personalize experiences, particularly via the ever-expanding and increasingly intimate data that is hand-waved through with a simple click of OK on a website’s cookie disclaimer.
If nothing else, Europe’s new GDPR legislation has pushed the privacy and security of our data, scattered as it is across the internet, up the public agenda. Companies that rely heavily on personal data from behind the Facade, or that commercially exploit consumer tendencies hidden away in the Blind Spot, may find themselves on the wrong side of a significant shift in public opinion on how our data should be used for commercial purposes.
What’s the alternative?
To mitigate this risk, companies should seek to involve consumers and change personalization from something they do to customers to a joint venture where people can customize their experience. Companies should also audit their own personalization efforts and ask whether they overly rely on data hidden behind the Facade.
Twenty years ago, companies began to report their corporate social responsibility activities in their annual report. This trend towards non-financial disclosure helped to paint businesses as good corporate citizens. Today there is a similar opportunity for businesses to overtly state their data usage policy, seeking not just to comply with the law but to build trust with their customers through additional efforts to treat data ethically and with the respect it deserves.
Source: VentureBeat