Approaching data use based only on compliance puts companies at a disadvantage
DDMA (Netherlands) – We recently published the 2022 edition of the Global Privacy Study in collaboration with our umbrella organization GDMA. With this survey we mapped out how consumers from 16 countries think about privacy and data use. Alongside the results, we would also like to offer concrete insight into how these topics are viewed within the Dutch business community. We spoke with Mark Kramer, Director of Customer Experience at KLM Royal Dutch Airlines, about how he sees the growing awareness of privacy among consumers, and how KLM is trying to adapt to this reality.
Mark Kramer is Director of Customer Experience with a focus on Data & Technology at KLM and is mainly concerned with humanizing customer contact and putting the customer first. Data and privacy, along with their accompanying legal and ethical issues, are important themes for him, especially in view of his role as Delegated Data Controller for Customer Data.
Make your data processing concrete
The use of data for personalized communication can benefit both customers and organizations. It is crucial, however, that customers clearly understand the advantages of the data processing required to improve the offer or service they receive (personalization), and that they know their data is in good hands. As an organization, you need to have a clear idea of what data you are using for what purpose, Kramer explains: “By describing processing in use cases, we try to make our data processing very concrete. The data you need depends on the message you want to send. For some messages you only need someone’s name and email address, for example. To make a relevant flight offer, you may need more, such as historical data. We try to make this as clear as possible to our customers, but that is and remains a challenge. Often it is a whole set of processing operations that leads to the interaction with the customer.”
Data ethics are mainly discussed as a theoretical subject
Personalization and techniques like Artificial Intelligence (AI) and machine learning often make consumers feel anxious. So, despite this anxiety, how do you ensure that consumers are aware of the benefits of the use of their data, and that their data is protected at the same time? At KLM, this is a continuous process of improvement: “We have drawn up a vision on data privacy, which describes our commitment to our customers and includes all the ethical principles we want to apply when it comes to processing personal data. We want to be transparent about how and which data we use, and in doing so, also offer maximum control to our customers.”
This may sound like a no-brainer. Yet data ethics are still mostly discussed and thought about in theory rather than acted upon, Kramer notes. “To arrive at concrete guidelines at KLM, we have started mapping customer needs based on customer research within our own customer community and by leading research firms such as Forrester and McKinsey, but also based on our purpose and brand values: what we want to stand for as an organization. We will soon be communicating our data privacy vision clearly to customers, but we are already discussing new data processing in relation to our vision and commitment to our customers. We think carefully about why we process data in order to determine what is desirable for our consumers and what is not. Especially in large organizations it is important to do this as a whole, with the same vision in mind. It is an important foundation for creating genuine customer confidence.”
Transparency can be guaranteed in many cases, but not always
You can create confidence by being transparent, among other things. Kramer: “We try to communicate clearly throughout the customer journey what we use data for. This transparency is easy to guarantee at some contact moments, but in many cases it is more difficult. For example, we have established certain design principles for our websites. Furthermore, most communication that goes out is based on the consent of our customers. At the same time, there are also interactions that are difficult to make more transparent, for instance when using historical data to make a relevant offer or improve a service: it requires discussion to determine which legal basis applies, and it also depends on which offer or service is being improved. We have already made some progress in explaining this to our customers, but we still have steps to take. Ultimately, the entire transparency package must give our customers sufficient awareness and control, which results in trust and earned confidence. We have really set this as a goal, and that is why we are also trying to measure it. We are currently working on a panel study to map out how consumers view our data vision and our commitment to them. Ultimately, we want to translate this into quantitative surveys so that we can really set targets.”
“It is essential to approach your data vision in a disciplined way; if you don’t, customers will inevitably get the feeling that you only claim to be transparent but in fact are not. Yet this remains tricky. The privacy paradox means that research results are not always accurate: there is indeed a growing awareness of privacy, but we see that different groups of customers view privacy and data sharing differently. In addition, data processing can be very complicated. For example, how do you explain an algorithm to customers?”
Organizations must take responsibility themselves, and not just rely on government-regulated markets
Data-driven marketing is increasingly under pressure due to privacy legislation. Nevertheless, there is room for the market to keep data-driven marketing under its own control, for example through self-regulation. Taking this responsibility as a market is extremely important, Kramer believes. “As organizations we absolutely must take responsibility. The moment we do that ourselves, based on intrinsic motivation and not on imposed rules, we create more trust. It often happens that organizations know the laws and regulations by heart and comply with them when it comes to data use. It is then mainly about compliance and less about the needs of the customer and what is really important for him or her. And this is not only the case for data use. In more and more domains it has become obvious that compliance overshadows customer value. And that is a big pitfall, because approaching data use based only on compliance puts companies at a disadvantage: you are less focused on what the customer’s needs are. If you start from the customer instead, you design your marketing strategy very differently and work from what you need as an organization to help customers, rather than from what you need to do to comply with the law. This is the way to differentiate yourself from the competition: by showing that you take good care of your customers. That is a responsibility companies can only take on themselves. An example of this is our aim to offer customers a good deal on products they have purchased from KLM in the past, like a seat with additional legroom or a visit to a museum via an intermediary. If we want to combine this offer with a service interaction, such as a check-in email, we need to determine the legal basis for this interaction and for the different types of offers separately.”
Sharing knowledge among sectors can be of great help
Collaboration can be fruitful. Kramer realized this all too well when he sat around the table with other organizations that are currently working on an ethical vision of data: “In large companies, people often think that it’s enough to have conversations about ethics within their own company, which is partly true. You can learn a lot from your direct colleagues, but in the end you’ll have a limited view. That’s why it was of great help to us at KLM that, with the support of the Dutch Data Marketing Association (DDMA), we were able to discuss customers’ concerns about data and privacy with other companies from various sectors. The DDMA Data Ethics Workshop eventually led to a solid framework for our vision and the future launch of principles for the use of personal data at Air France-KLM.”
“But to create a data vision you have to know what you stand for as a company. That is the starting point, which for many organizations is still about compliance. How do you become more mature in what you stand for as a company? What values do you have? And how do you translate that into the way you deal with personal data? There is still great potential in the market for companies to differentiate themselves here, a need in which the GDMA’s Global Privacy Principles and the Dutch Data Marketing Association (DDMA) play a role.”
The privacy paradox is diminishing
Customers are becoming more data conscious. They are thinking more and more about questions like: what data do I share, and with which organization am I sharing it? According to Kramer, the privacy paradox will therefore become a lot less prominent. “An important catalyst for this is the news, for example about Russian hacking attacks or the data leaks at the Municipal Health Services in the Netherlands (GGD). At the same time, it is becoming increasingly difficult for consumers to monitor, let alone manage, all the processing of their own personal data. As a result, new parties are appearing on the market, such as Gener8 or schluss.org, where consumers can indicate their preferences for how their data is managed. These companies are the gatekeepers, so to speak, that guard consumers’ data. But this trend is much broader. More parties are responding to the growing awareness. Internet browsers block the use of cookies, and tech companies like Apple offer consumers more and more options for managing their data easily. Companies are increasingly going to see data, and the way they handle it, as part of a strategy to attract customers who are more aware of the value of their data.”