Have you ever mentioned buying new sneakers to a coworker over the morning coffee and been bombarded with shoe ads over the lunch break? Personalization is an exciting tool that opens many doors for marketers and users alike. However, it can quickly cross into unethical territory. Stephanie Ostermann and Maz Tannir debate where we should draw the line.
Stephanie: Personalization means that customers don’t have to spend time looking at or wading through things that they’re not interested in. Nothing is more annoying than an inbox flooded with mistargeted messages. That is just a better user experience.
For the business, it means that you can really narrow down and target your advertising content to those individuals who are already looking for you. It also allows you to really segment audiences and make sure that you’re reaching just those people that you need at the moment.
Maz: There are a few of them. Number one is the idea of privacy invasion. Personalization often requires collecting extensive data on individuals, which could be perceived as intrusive or raise some privacy concerns.
You’ve also got the potential data security risks down the line. The more data companies are collecting, the greater the risk of data breaches, which can expose sensitive information and harm consumers.
Another concern is over-personalization. Excessive personalization can make ads feel invasive or creepy, fueling consumers’ distrust of the brand.
The last thing would be the potential for discrimination in personalization algorithms. They can inadvertently reinforce profiling biases, leading to unfair treatment of certain groups.
Stephanie: Whenever we collect data, we need to make sure that we’re doing it above board, following all the rules, getting the correct consent from people, and explaining what data we’re collecting and how we’re using it in a way that the audience understands.
A big privacy policy full of legal speak is not helpful; clear, thought-out statements are. It’s also good practice to remind customers from time to time how their data is used and give them an opportunity to update it.
We get bombarded with so much information that we sometimes forget that we did a search for duvet covers, for example. And then, when we see a duvet cover ad, we think that our phones must have been listening. That’s why transparency is so important, as well as the option to review the data you allow a company to collect, or to opt out altogether.
Maz: Consent and transparency are absolutely at the top of the list. But there’s also the question of data ownership—who owns personal data, and how much control should consumers have over their information?
Then, you’ve got the issue of surveillance. It can feel like companies are monitoring our behavior online. It’s taken further by the common assumption that marketing companies are using all this information to exploit vulnerabilities. These concerns are reasonable because companies can do far more with data sets today than ever before.
Stephanie: Clean data is probably the hardest challenge. It can be as simple as people providing a fake name when filling out a form and then feeling mistargeted when they get emails that don’t seem to be addressed to them. And again, asking people to review and update their information is one way of overcoming that.
But it’s also about watching what people do with the content you have personalized. If they’re not even opening the email or not clicking through, you might have the wrong person. Should they be moved to a different segment? Yes. That way, they’ll get less from you and actually have a personalized experience.
Maz: First and foremost, users may choose to be anonymous. Many privacy tools make data collection difficult, which limits personalization capabilities. Then, you have regulatory compliance. Whether it’s GDPR, CCPA, or CASL, adhering to these regulations can be quite a barrier.
There are also algorithm limitations. They’re not perfect, and they can make incorrect assumptions, leading to irrelevant or inappropriate recommendations. Plus, there’s an issue of scalability. Personalization is pretty resource-intensive and can be costly.
And then, as Stephanie mentioned, data quality is everything. You are getting data from multiple sources and trying to integrate it across all of these different systems, which can be tedious. We hear about all of these technologies that simplify this for us, but are they adhering to regulations? So, there are a lot of challenges to overcome.
Stephanie: Consider very carefully what type of data you are collecting. It can be tempting to collect every single bit possible. But too much information can become very confusing down the road.
Just think about what you need to make the user experience as positive as possible. And then forget about the rest. Because the deeper you go, the weirder it gets, and the more those algorithms can make extrapolations that aren’t necessarily true.
Maz: Once again, prioritize transparency and consumer consent. Focus on telling your customers what data you collect and how you are using it. That, along with getting explicit consent, helps build trust in your brand.
Stephanie: Use personalization methods people opt in to, such as an email list or a text list where users can sign up. This way, you have clear confirmation that the consumer has agreed to receive some sort of communication from you, and you establish a relationship. And if you can, build your own lists, then maintain and nurture them—that’s my happy spot.
Maz: Maybe the answer is in opt-in models. We’ve seen various organizations and their sites do that by asking users whether they would like their experience to be personalized instead of making an assumption. That would allow users to feel more comfortable and companies to know for sure that they have consent to use the data.
Connect with us to leverage the skills and knowledge Stephanie, Maz, and the rest of the WS team offer for the marketing success of your project.