By Aqilliz
Published on June 15, 2020
As economies gradually reopen across the globe, governments are urgently seeking innovative tools to contain the coronavirus pandemic and flatten the curve. One crucial aspect of this mitigation strategy is known as contact tracing. Due to its laborious nature, the process has quickly evolved into digital tracking in many countries—a popular tool for authorities to monitor and contain the spread of the virus through technologies such as Bluetooth proximity detection. While tracking software and smartphone apps have demonstrably played a critical role in slowing the transmission of COVID-19, as shown by countries such as South Korea, they've also raised fundamental questions about personal data and privacy.
Concerns about the long-term implications of such forms of surveillance have led to renewed conversations surrounding data privacy frameworks, as well as how they fit in during such unprecedented times. In this regard, marketers are well-acquainted with having to weigh the costs and benefits of access to personal data. Having navigated regulators' scrutiny of the ethics of personalisation and targeted advertising, marketers are well aware of the pitfalls of collecting more data than needed, and of the obligation to store it securely and dispose of it once a consumer's relationship with a brand concludes.
The current pandemic is a catalyst for much-needed conversations in the data arena. If the last few months have taught us anything, it's that transparency must ultimately underpin how companies collect and share aggregate data if they are to build trust and accountability with customers who expect privacy.
How should marketers move forward? We’d like to argue that it all begins with differential privacy.
What is differential privacy?
While we covered this in one of our earlier blog posts, it merits unpacking again. Differential privacy is a mathematical technique, with roots in cryptography research, that allows companies to collect and share aggregate data about user habits while preserving individual privacy.
The general idea is that it can provide a company with insights based on data from users, without knowing what exactly that data says or from whom it originates. To obscure an individual’s identity, differential privacy adds mathematical noise to a small sample of the individual’s usage pattern before it’s averaged with other users’ patterns to further mask individual data points.
As more people share the same pattern, general patterns begin to emerge, which can inform and enhance insights about that specific group. Aaron Roth, co-author of The Ethical Algorithm: The Science of Socially Aware Algorithm Design, sums it up neatly, describing the process as the ability to introduce "plausible deniability" into a data set.
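As an illustrative sketch (not any particular vendor's implementation), the classic "randomized response" technique shows how plausible deniability works in practice: each user perturbs their own answer with coin flips, so no single response can be trusted, yet the true population rate can still be recovered from the aggregate. The 30% population rate and sample size below are arbitrary assumptions for the demo:

```python
import random

def randomized_response(truth: bool) -> bool:
    """Report the true answer only half the time; otherwise answer at random.
    Any individual response is deniable: it may just be a coin flip."""
    if random.random() < 0.5:
        return truth                   # tell the truth
    return random.random() < 0.5       # answer with a fair coin flip instead

def estimate_true_rate(responses) -> float:
    """Invert the known noise: observed = 0.5 * true + 0.25,
    so true = 2 * observed - 0.5."""
    observed = sum(responses) / len(responses)
    return 2 * observed - 0.5

random.seed(42)
# Simulate 100,000 users, 30% of whom truly have the sensitive attribute.
population = [random.random() < 0.30 for _ in range(100_000)]
responses = [randomized_response(t) for t in population]
# The aggregate estimate recovers the population rate despite per-user noise.
print(round(estimate_true_rate(responses), 3))
```

Because the noise distribution is known, its effect can be subtracted out in aggregate—exactly the property that lets companies learn group-level insights without trusting any individual data point.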
For marketers who now grapple with the upcoming loss of cross-site tracking enabled by third-party cookies, differential privacy is a boon that will allow for precise targeting and personalisation in a compliant manner. Other industries are equally taking note, with most big tech firms that hold large swathes of data applying the technique in different ways. Most notably, the Mayo Clinic, Facebook, and the Massachusetts Institute of Technology have been working together to explore its use in contact tracing applications. Google also leveraged differential privacy for its mobility reports, which aggregated user movement patterns across public spaces once social distancing measures were introduced in different cities.
However, the key here is that for differentially private data sets to be accurate, they need to be large, so that the added statistical noise doesn't compromise the accuracy of the aggregate results.
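A minimal sketch of why scale matters, using the standard Laplace mechanism on a simple count query (the privacy parameter epsilon and the counts below are illustrative assumptions): the same fixed amount of noise that distorts a small count barely dents a large one.

```python
import random

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    """Laplace mechanism: a count query changes by at most 1 when one person
    is added or removed, so adding Laplace(1/epsilon) noise makes the
    released count epsilon-differentially private."""
    # Python's random module has no Laplace sampler; the difference of two
    # exponential draws follows a Laplace distribution with scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

random.seed(7)
for n in (100, 100_000):
    released = noisy_count(n)
    relative_error = abs(released - n) / n
    print(f"true={n:>7}  released={released:>11.1f}  "
          f"relative error={relative_error:.3%}")
```

The absolute noise is the same in both cases, so the relative error on the 100,000-person count is roughly a thousand times smaller—which is why differential privacy shines on large data sets and struggles on small ones.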
An era of privacy regulation
The roots of differential privacy date back to research coming out of the cryptography community in the late 90s and early 2000s. Since then, the rise of the data economy has seen companies find enormous value in collecting, sharing, and using data. However, in the wake of prolific data breaches and prominent scandals, lawmakers have recognised the importance of holding companies responsible for end-user data. When the General Data Protection Regulation (GDPR) took effect back in 2018, the digital world was thrust into a new era of privacy regulation, setting unprecedented standards for data protection, transparency, and accountability.
With a host of regulations governing how businesses treat privacy and personal data, the marketing industry is waking up to an era where insecure and invasive methods of data collection simply aren't an option. Given the current regulatory environment, it's important for marketers to put privacy at the forefront and consider less invasive ways to use data, while still getting relevant content in front of the right consumers at the right time.
A data-centric world
At the same time, marketers must balance the reality that consumers have come to expect ads that match their interests, personalised offers, and relevant content that enhances their online experiences. To continue providing such tailored recommendations, companies will need to adapt accordingly.
One approach that companies have considered is data masking—obfuscating sensitive data with fictitious data for the purposes of anonymity—as a way to protect their data, avoid the cost of security breaches, and ensure compliance. Though this may sound similar to differential privacy, the key difference is that full anonymisation destroys the value of the data: an algorithm cannot serve personalised recommendations if it doesn't know what the user's habits are. Differential privacy, by contrast, preserves aggregate patterns while protecting individuals.
Cognisant that the loss of third-party cookies on Google Chrome would thrust the marketing ecosystem into a state of panic, Google's Privacy Sandbox offers an arsenal of tools built on the very foundations of differential privacy. In a bid to ensure that tomorrow's internet is privacy-preserving by design, the sandbox contains a myriad of proposals impacting multiple areas of the marketing ecosystem, including targeting, profiling, re-targeting, lookalike modelling, and co-marketing.
In 2019, the global marketing data market was valued at US$26 billion, and this figure is only slated to grow as we venture into an increasingly data-driven way of life. In light of that, brands and marketers will need to recognise the trade-off between meeting consumer expectations and satisfying regulatory demands. No longer confined to conversations within academic circles, differential privacy is fast captivating business leaders across the globe, irrespective of industry—and marketers are certainly no exception. With time, we hope to see a global digital economy that can effectively balance consumer rights and commercial interests, fully underscored by privacy-by-design.