Talking with Dr. Ann Cavoukian, Privacy by Design inventor


Key Points: 

– Privacy by Design is such a revolutionary concept because it requires proactive action to protect user privacy, rather than action taken only after a company gets slapped on the wrist by a lawsuit.

– Positive consent for every purpose of data collection is crucial in Privacy by Design, and the second principle is Privacy by Default. Privacy by Design is required to reach GDPR compliance, and it is a win-win: it builds trust with users like nothing else, and it can increase the quality of the data collected.

– Creating a data map for your organization is the first step towards understanding what you need to get consent for and how best to protect your users’ privacy.

– There is an overwhelming demand for privacy coming from consumers.

Watch the full session:


Patricia:
I’m really, really excited to have Ann Cavoukian with us today, who previously served three terms as Privacy Commissioner of Ontario and is the inventor of Privacy by Design. She has also received numerous awards, including for leadership in privacy. She was named one of the Top 25 Women of Influence in Canada, among the top 10 women in data security and privacy, one of the Power 50 by Canadian Business, and among the top 100 leaders in identity, and she was recently awarded the Meritorious Service Medal by the Governor General of Canada. So it is such an honour and pleasure to have our advisor and friend here today. Ann, thank you for joining us.

Ann: My pleasure. It’s always a pleasure working with you, Patricia. So it’s truly my pleasure.

Patricia: Really excited to dive right in and ask you as many questions as we have time for. There will be a Q&A at the end, so please put in your questions in the chat.

What is Privacy by Design and how can organizations embed it into their workflows?

Patricia: The first question for you, Ann, is: could you tell us what is Privacy by Design and how can organizations embed it into their workflows?

Ann: Privacy by Design is all about being proactive. It’s a model of prevention, much like a medical model of prevention. You want to prevent the privacy harms from arising. See, when I was first appointed as Privacy Commissioner, this was a new concept, because lawyers — and of course, I’m not a lawyer, I’m a psychologist — but lawyers traditionally apply the law once there has been a problem: when there’s a data breach or a privacy infraction, you apply the law and get a wonderful remedy. That’s invaluable. But I wanted more than that. I wanted to prevent the privacy harms from arising. So I wanted a model that could be embedded into one’s operations, into the design, and baked into the code. So I developed Privacy by Design literally at my kitchen table over three nights, and then I took it to work and tried to convince the lawyers, which I did, as to how it could complement regulatory compliance, which is after-the-fact. And so I always encourage companies — startups especially, because it’s so easy when you’re just starting up — to get this going. You want to ensure that privacy-protective measures are present throughout your entire operation from the get-go. Right from the beginning, when you first start data collection: what is the primary purpose of the data collection? Do you have positive consent? And then, do you restrict your use of the data to that purpose? Later on, if you need additional consent for a secondary use, you can go back and ask for it. But again, you need positive consent. This goes such a long way to building trust. That’s what I’ve been told by companies who have been certified for Privacy by Design. It builds trust like no other. And right now, there’s such a trust deficit. This builds enormous trusted business relationships.

I wanted a model that could be embedded into one’s operations, into the design, and baked into the code. So I developed Privacy by Design literally at my kitchen table over three nights, and then I took it to work and tried to convince the lawyers.

Patricia: Absolutely love that. And I think we’ve observed that a lot in the new businesses that are popping up and really putting privacy at the forefront.

What do people often get wrong when integrating privacy into their workflows?

Patricia: A really important question is: what have you found that people get wrong when they’re integrating privacy into their workflows?

Ann: You know, they think about it certainly at the beginning, usually. So they start from a good place, but one thing they don’t seem to grasp is that you don’t just get consent for the initial, primary purpose of the data collection and then use the information in a dozen different ways that flow throughout your organization. This is the problem. You need to get consent for every particular use of the data that differs from the primary purpose of the data collection. So I would say to companies, “do you have a data map?” and they always look at me and say “huh?” and I would say “that’s the most valuable thing you can do right from the beginning, because it maps all of the data throughout your organization and will alert you if you need to obtain additional consent for subsequent purposes that differ from the initial primary purpose of the data collection.” That’s absolutely critical. So if you have a data map, it’s an easy way to map the flow of personal information throughout your organization and determine if additional consent is required, and then you can obtain it.
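To make the data map idea concrete: it can start as nothing more than a structured inventory of each personal-data element, the purpose it was collected for, and the consents obtained, plus a check that flags any use falling outside those consents. Here is a minimal sketch in Python; the field names and the `uses_needing_consent` helper are illustrative assumptions, not a format Cavoukian prescribes:

```python
from dataclasses import dataclass, field

@dataclass
class DataMapEntry:
    """One personal-data element tracked through the organization."""
    element: str                                  # e.g., "email_address"
    collected_for: str                            # primary purpose of collection
    consented_purposes: set = field(default_factory=set)
    used_by: list = field(default_factory=list)   # teams/systems touching it

def uses_needing_consent(entry: DataMapEntry, proposed_uses: list) -> list:
    """Flag proposed uses that fall outside the purposes already consented to."""
    allowed = entry.consented_purposes | {entry.collected_for}
    return [use for use in proposed_uses if use not in allowed]

email = DataMapEntry(
    element="email_address",
    collected_for="account_login",
    consented_purposes={"account_login", "security_alerts"},
    used_by=["auth-service"],
)

# Marketing analytics was never consented to, so the map flags it:
print(uses_needing_consent(email, ["security_alerts", "marketing_analytics"]))
# -> ['marketing_analytics']
```

Even a sketch this small captures the value Cavoukian describes: the map itself tells you when a new use of the data requires going back for consent.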

Patricia: That’s very valuable advice. How does an organization not become chaotic without a data map like that?

Ann: Good question.

There’s a lot of confusion around the word anonymization. What are some dangers of misusing privacy enhancing technologies?

Patricia: We’ve noticed that there’s a lot of confusion around the word anonymization, right? And we’ve seen it being misused in so many different contexts, in so many different newspaper articles. Can you tell us what some dangers are of misusing privacy enhancing technologies?

Ann: The goal is anonymization of data, so that you can use the data for a variety of other purposes and obtain value from it without jeopardizing the privacy of the individuals to whom the data relates. But I always say: look, don’t strive for perfection. Just do your best to move forward and try to achieve anonymization. And the reason I say that is when we talk about de-identifying data, you can’t just strip a name or a number; you have to use strong de-identification protocols combined with a risk-of-re-identification framework that can minimize the risk of re-identification to less than .05%, which is less than the odds of being hit by lightning if you go outside when it’s raining out. You know, damn good odds. But trying to get perfection, I would avoid that. Try to minimize the likelihood of re-identification, and then you effectively have de-identified data which you can treat as anonymized data. And you can use that data for a variety of other purposes in terms of data utility, which can have enormous value to a company without risking the privacy of the individuals involved, because you’ve gone to great lengths to strongly de-identify, anonymize, the data. That makes it a win-win. Privacy by Design is all about win-win. Let’s get rid of the zero-sum mindset of either-or, win-lose, privacy vs. security, or privacy vs. data utility. That’s nonsense. You need both. And it’s never privacy that wins over the other interests, nor should it be, but I sure as heck am not going to have it lose out all the time. So if you make a strong effort to strongly de-identify the data, reaching the goal of anonymization, then you free yourself and you free the data for enormous data utility that can arise afterward.

[I]f you make a strong effort to strongly de-identify the data, reaching the goal of anonymization, then you free yourself and you free the data for enormous data utility that can arise afterward.
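One common way to quantify the re-identification risk Cavoukian mentions is k-anonymity: checking how many records share the same combination of quasi-identifiers, since the smallest such group bounds the worst-case risk. The toy sketch below, with invented column names and data, shows the idea; production frameworks are far more rigorous:

```python
from collections import Counter

def worst_case_reid_risk(records: list, quasi_identifiers: list) -> float:
    """Worst-case re-identification risk under a simple k-anonymity model:
    1/k, where k is the size of the smallest group of records sharing the
    same combination of quasi-identifier values."""
    groups = Counter(
        tuple(rec[qi] for qi in quasi_identifiers) for rec in records
    )
    return 1.0 / min(groups.values())

records = [
    {"age_band": "30-39", "postal_prefix": "M5V", "diagnosis": "flu"},
    {"age_band": "30-39", "postal_prefix": "M5V", "diagnosis": "asthma"},
    {"age_band": "40-49", "postal_prefix": "K1A", "diagnosis": "flu"},
]

# The lone 40-49/K1A record is unique, so its holder is fully re-identifiable:
print(f"{worst_case_reid_risk(records, ['age_band', 'postal_prefix']):.2%}")
# -> 100.00%  (a release policy would generalize or suppress values until
#              the risk falls below its threshold, e.g. the .05% cited above)
```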

What are some practical applications for de-identification?

Patricia: That leads perfectly into what we want to ask you next, which is: what are some practical applications for this de-identified data? Because there is also the misconception that, once data is anonymized, it’s no longer useful for anything.

Ann: That’s nonsense. Let me talk about smart cities. And you may know that I was retained by Sidewalk Labs when they had the contract to develop the smart city here in Toronto, which is where I live. So I wanted a smart city of privacy, not a smart city of surveillance, and at the time, when they retained me, that’s what they wanted too. They said they wanted me to embed Privacy by Design into the smart city that they envisioned. And after I studied it considerably, I said what we have to do is de-identify data at source, meaning right at the time it’s collected. Because, if you do that, then you strip the personal identifiers right at the time the data are collected. And the reason you need to do that in a smart city is because the tech is everywhere. You can’t just decide here or there. No, it’s everywhere. You risk privacy everywhere. So, you make a policy: I’m going to strip all the personal identifiers from the data right at the time it’s collected. De-identify data at source. Then you have massive value from the data you’ve collected for a variety of smart city applications, involving transit, involving the flow of people on highways, and there’s so much you can do with the data. And Sidewalk Labs agreed to that from the outset, but then they reneged later on. And when they reneged, I resigned the following morning. Now the good news is, lots of other people have approached me. So I’m now working with a wonderful group called Innovate Cities, and what they’re doing is trying to develop a framework for how you build smart cities in a way that protects privacy. And the way they got me to work with them is they said: we are insisting on Privacy by Design right from the outset. Always de-identify data at source — we heard you loud and clear, that’s what we want. And I said: “Are you sure? Because look, I always look under the hood. You can’t just say that to me. Walk the talk.” And they have assured me again and again. So I’m delighted. And then another group: I just got contacted from San Francisco, and by a company in Michigan which I can’t name at the moment, and they want to develop a small smart city application in the Detroit area to lift it up. So they approached me and they want Privacy by Design embedded. Anyway, smart cities are one example, because you’re going to have so much collection of data, which is very important. You must strip it of privacy risks first.
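“De-identify at source” means identifiers never enter the pipeline at all: they are stripped or transformed in the collection code itself, before anything is stored. A minimal sketch of that pattern follows; the event fields and the salted-hash token are illustrative assumptions, and a keyed hash is pseudonymization rather than full anonymization, so a real deployment would pair it with aggregation and a re-identification risk assessment like the one sketched earlier:

```python
import hashlib
import os

# Secret salt kept outside the data store; without it, tokens cannot be
# reversed into the original identifier by whoever holds the dataset.
SALT = os.environ.get("COLLECTION_SALT", "dev-only-salt")

DIRECT_IDENTIFIERS = {"name", "email", "device_id"}

def collect_event(raw_event: dict) -> dict:
    """De-identify a smart-city event at the moment of collection:
    drop direct identifiers, keep only the useful, coarse attributes."""
    event = {k: v for k, v in raw_event.items() if k not in DIRECT_IDENTIFIERS}
    # Keep a one-way token if trip-level linkage (not identity) is needed.
    if "device_id" in raw_event:
        digest = hashlib.sha256((SALT + raw_event["device_id"]).encode())
        event["trip_token"] = digest.hexdigest()[:16]
    return event

stored = collect_event(
    {"device_id": "A1B2C3", "name": "Jane Doe", "station": "Union", "time": "08:14"}
)
print(stored)  # {'station': 'Union', 'time': '08:14', 'trip_token': '...'}
```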

Patricia: That’s a great example, and I’m glad that we’ll have very obvious examples, more and more coming, of de-identified data being super useful to our day-to-day lives.

What responsibility do developers have when it comes to privacy? And how does that compare to the rest of an organization?

Patricia: One huge question, because developers are just getting familiar with Privacy by Design, privacy-enhancing technologies are still not well covered in universities, and we’re just starting to see the ramp-up in the education developers need in order to integrate this into their own projects. So what do you think, for now, are the responsibilities of developers when it comes to privacy, and how does that compare with the responsibilities of the rest of the organization?

Ann: It all flows together. If you don’t address privacy issues as a developer, you will fail. And I don’t mean to be extreme here, but if you have personally identifiable data that you are using throughout whatever it is you’re doing in your organization and you don’t address privacy concerns, you will invariably have a data breach or a privacy infraction. And these days, the concern for privacy is enormous. I’ve been in this business well over 20 years, and I’ve never seen it like this: in all kinds of privacy opinion polls and research polls, privacy has been ranked at the 90th percentile consistently. 90% of those interviewed are concerned about their privacy. 92% are concerned about loss of control over their information. And privacy is all about control: personal control over the use and disclosure of your information. And I point companies who are considering adopting Privacy by Design to what happens if you don’t do it. You will have a data breach or privacy infraction. And these days they are no longer lawsuits, they are class action lawsuits that can cost companies millions of dollars. I’m not exaggerating. Target, about 4–5 years ago, had a massive data breach in the United States. The CEO of Target resigned. Target had just opened several stores in Canada for the first time, a few years prior. And we were very excited to have them here. But once there was this massive data breach, as I said, the CEO resigned in the United States, the President of Target Canada was fired, and within months all of the Target stores in Canada were closed — gone. People lost confidence. There was just no more business there. So it can cost you enormously if you don’t address privacy. Don’t just do it because it’s the right thing to do. I always tell companies and governments, “just because you have custody and control over someone’s personal information doesn’t mean it belongs to you. It belongs to the data subject. So give them the ability to gain access to their data and be transparent; give them the right of access. When you do that, you may think you’re just doing it for them. No, you’re doing it for yourself. Because you will build trust. And as I’ve said, there’s such a trust deficit right now. It will build trusted business relationships with your customers, which is what all companies need.” It is in the company’s best interest, the developer’s best interest, to embed privacy-protective measures into the design of their operations.

[J]ust because you have custody and control over someone’s personal information doesn’t mean it belongs to you. It belongs to the data subject. So give them the ability to gain access to their data and be transparent, give them the right of access. When you do that, you may think you’re just doing it for them. No, you’re doing it for yourself. Because you will build trust.

Patricia: Love that.

Why is it scary to many organizations to tug on the thread of privacy?

Patricia: Given that developers should have that mandate, the organizations should have that mandate, and it’s very clear that trust is at the center of maintaining good customer relationships and how much privacy can help with that, there’s still — we’ve observed — a hesitancy towards integrating privacy. We’re wondering what you’ve observed in terms of why it’s so scary for many organizations to start thinking about integrating Privacy by Design.

Ann: If people haven’t been exposed to Privacy by Design, which is like 99% of the population, they don’t know that it involves positive-sum, not zero-sum, meaning privacy and data utility, no matter the business they’re in. The way privacy tends to be portrayed is either-or, zero-sum. You’ve got to protect people’s privacy and that’s going to impact your business negatively, and that’s too bad. That’s nonsense! But that’s the zero-sum mindset that’s out there. I meet with so many companies, and I go into the board room and I see all of these people sitting around the table, kind of not very happy to see me; they’ve got to do it and they don’t want me there. And I say, “Give me 20 minutes. Give me 20 minutes to present Privacy by Design to you. And if you’re still reluctant to do it, I’ll leave. I’m happy to leave. But I want to show you how it can build a competitive advantage for your company over the other guys who don’t do it.” And they look at me and go, “Oh, you mean there’s some kind of positive associated with privacy?” I say, “Yes! It’s all about the positive! That’s what I want to present to you! And that’s what I want to show you: that instead of being afraid of the privacy issue, you will want to embrace it, because it will bring you many positive returns.” One of the seven foundational principles of Privacy by Design is to give customers access to their own personal information. Companies who have done this have come back to me and said, “I love this, because customers come back to us and tell us, ‘that’s not correct — that was the case two years ago, but it’s changed; here’s what’s going on now.’ They correct the information we have. It increases the quality of our data.” They love this. That’s just one example of the benefits.

One of the seven foundational principles of Privacy by Design is to give customers access to their own personal information. Companies who have done this have come back to me and said, “I love this, because customers come back to us and tell us, ‘that’s not correct — that was the case two years ago, but it’s changed; here’s what’s going on now.’ They correct the information we have. It increases the quality of our data.”
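The access-and-correction loop Cavoukian describes is straightforward to picture in code. The sketch below is a hypothetical illustration (the class and method names are ours, and a real system would add authentication, audit logging, and export formats) of how giving data subjects access directly improves data quality:

```python
class AccessPortal:
    """Right of access and correction for data subjects.
    Hypothetical sketch; a real system adds auth and audit logs."""

    def __init__(self, store: dict):
        self.store = store  # customer_id -> dict of fields held about them

    def export(self, customer_id: str) -> dict:
        # Right of access: show the subject everything held about them.
        return dict(self.store[customer_id])

    def correct(self, customer_id: str, field_name: str, new_value) -> None:
        # Corrections from the data subject improve data quality directly.
        self.store[customer_id][field_name] = new_value

portal = AccessPortal({"c42": {"city": "Ottawa", "plan": "basic"}})
print(portal.export("c42"))               # {'city': 'Ottawa', 'plan': 'basic'}
portal.correct("c42", "city", "Toronto")  # the subject fixes stale data
print(portal.export("c42"))               # {'city': 'Toronto', 'plan': 'basic'}
```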

Patricia: Beautiful example.

Who should be responsible for enacting and measuring privacy within an organization?

Patricia: We touched upon this a little bit: developers should have a mandate. But who else in an organization do you think should be responsible for enacting and also measuring privacy?

Ann: You have to have something equivalent to a Chief Privacy Officer. If you’re a smaller organization, you may not want to go to that level, but you need an individual who’s charged with the responsibility of protecting privacy throughout your organization. Which includes looking under the hood, measuring privacy and how effectively it’s being monitored and extended to your customers. This is so important. When I talk to boards, I always tell them to make this individual report directly to the CEO, because if you report through a manager who reports to someone else, managers are afraid of ticking off the CEO by saying, “oh, we’re not doing very well on privacy.” So get them to report directly to the CEO, who will understand that protecting privacy and offering the strongest privacy to your customers will give you positive returns again and again; you will gain a competitive advantage. It’s a total positive. But you need to give the Chief Privacy Officer the ability to enact this in some meaningful way, so that if they see it’s not working in some department, they feel comfortable reporting on that and saying, “look, we need to upgrade this, it’s just not working here.” They need to be able to speak truthfully, and measurement often involves some negative findings. You have to prepare yourself for that and understand that, ultimately, you will gain by doing so.

Patricia: Perfect.

What should we expect from upcoming privacy legislation around the world?

Patricia: So, all of that said, legislation around the world is moving more and more towards including Privacy by Design. Notably, the CPPA in Canada doesn’t seem to include it. Would love to get your perspective on that, and on how privacy regulations are moving across the world.

Ann: Don’t get me started on Bill C-11 — the upgrade to our Federal private sector regulation — it just makes me gag. First of all, the GDPR, the General Data Protection Regulation in the European Union, came into effect in 2018. Everybody had been waiting for this forever and, to my delight, they included Privacy by Design and Privacy as the Default, which is the second foundational principle of Privacy by Design, and which is a game changer. I was delighted. And laws all around the world have been upgrading to meet the requirements of the GDPR so that they could achieve essential equivalence and continue to engage in trade and commerce with the EU without fear of reprisal. Before the GDPR, our Federal private sector legislation was essentially equivalent with what they had before, but no longer! So, in 2017, just before it came into effect, our wonderful Federal Privacy Commissioner, Daniel Therrien, said to the government, “look, we’ve got to upgrade our law! We’re no longer essentially equivalent with the GDPR. We need to upgrade it. PIPEDA dates from the early 2000s, and we need to add Privacy by Design into the upgrade, because that’s what they’ve done in the EU and, after all, it was created by a Canadian, Ann Cavoukian. We’ve got to add that here.” In response to that, in 2018, the Federal Government put out a paper called Towards Privacy by Design. It was all about how they intended to upgrade PIPEDA and include Privacy by Design: yay! Wouldn’t you think “yay”? No! So here comes Bill C-11, the upgrade to PIPEDA, and is there any reference to Privacy by Design in it? No! None whatsoever. It is just appalling to me. And there’s a whole bunch of other problems with it, which I can go into afterwards, but let me just stop here and say that what we should expect from privacy legislation is what they’re doing around the rest of the world, which is embedding Privacy by Design into their upgrades, because they recognize that we’ve got to get proactive. We can’t just have things kick in after-the-fact. We have to get in front of this and have privacy-protective measures embedded into our operations, baked into the code, into all of our design.

Patricia: Completely agree. So curious to pick your brain later to see why that may have happened. We are at question period. We have 5 minutes for questions if people want to drop their questions on the right side.

Ann: Patricia, if I can just remind everyone who is listening: privacy forms the foundation of freedom. You cannot have free and open societies without a solid foundation of privacy. The fact that surveillance is mounting doesn’t mean we’ve lost. It’s like a chess game: point, counterpoint. And there’s a huge movement towards decentralization now. There’s a Decentralized Identity Foundation that was created last year. It’s growing dramatically. All of the major companies are included in it: Microsoft, IBM, etc. We are making gains. We never give up.

Q & A: 

Why wasn’t Privacy by Design included in Bill C-11?

Ann: Gee. Let’s ask the Federal Government. I have not one clue. I’m not kidding. And there’s a number of other problems with it. The good thing was, I kept saying to the government, “You’ve got to give the Federal Commissioner order-making power.” I had order-making power when I was Privacy Commissioner of Ontario for 3 terms. It makes a big difference. It’s a stick, which I rarely used, but the other side knows you have it, so they engage in informal resolution with you and you get wonderful outcomes. Federal Commissioners never had it. So I said, “You’ve got to give it to them.” So what do they do? They gave them order-making power in Bill C-11: yay. But then what do they do? They create this tribunal that people can go to to appeal the order. If they don’t like the order, they go to a tribunal instead of going to court, which is what happens with all of the other commissioners in the provinces. This tribunal only has to have 1 privacy person on it. The other people can consist of I don’t know who, lawyers, whatever. And the commissioner’s decisions can be easily undone by going to this stupid tribunal that doesn’t exist anywhere else. Not in Canada, not elsewhere. So where they got this model from, I have no idea. And there are so many other problems as well.

Do you expect upcoming regulation to mandate Privacy by Design beyond what GDPR and CCPA prescribe (penalizing issues after-the-fact: reactive vs. proactive)?

Ann: What I’ve seen is that, ever since the GDPR included Privacy by Design and Privacy as the Default, new laws being created in Brazil and other jurisdictions are including Privacy by Design. I’m delighted with that. Let me just explain Privacy as the Default for a moment. There are seven foundational principles of Privacy by Design. The second one is Privacy as the Default setting. And it is so important. It’s a true game changer, because what it does is say to people, “You don’t have to ask for privacy. You don’t have to search through all of the Terms of Service and wade through all of the legalese in the Privacy Policy to find the opt-out clause that says ‘Do not use my personal information for any purpose other than the intended, primary purpose of the data collection’.” It takes way too long to do that. We know that people aren’t doing that. Life is too short. But people want privacy like never before. As I mentioned, 90% of them, easily, are very concerned about their privacy. Privacy as the Default flips that on its head and says to people, “You don’t have to ask for privacy. We give it to you automatically. It’s the default setting. It exists throughout our operations. We can only collect your personal information for the primary purpose of the data collection, for which we need your positive consent. Beyond that, we can’t use your information for anything else. And if a later use arises — a secondary use that we would like to use your information for — we need to come back to you and seek your positive consent.” This is such a game changer. It builds trust like no other. People love it. Companies love it. It’s a win-win.
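In code, Privacy as the Default means every consent flag starts off, and any use beyond the primary purpose must find an explicit, recorded opt-in. A small sketch of that rule (the class and method names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Privacy as the default setting: nothing beyond the primary
    purpose is permitted until the user explicitly opts in."""
    primary_purpose: str
    opted_in: set = field(default_factory=set)  # empty by default

    def grant(self, purpose: str) -> None:
        self.opted_in.add(purpose)  # positive, explicit consent only

    def may_use_for(self, purpose: str) -> bool:
        # The primary purpose was consented to at collection; every other
        # (secondary) use requires a separate, recorded opt-in.
        return purpose == self.primary_purpose or purpose in self.opted_in

consent = ConsentRecord(primary_purpose="order_fulfillment")
print(consent.may_use_for("order_fulfillment"))  # True
print(consent.may_use_for("marketing_email"))    # False, until...
consent.grant("marketing_email")                 # ...an explicit opt-in arrives
print(consent.may_use_for("marketing_email"))    # True
```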

One of the most useful aspects of data is the associations between different types of data. How do we preserve these associations when we anonymize the data?

Ann: You can anonymize the data and still keep the characteristics of the data linked to it. So you’re not going to know that this data relates to Ann Cavoukian, but you can still ask for information relating to the individual: What kind of work do they do? Are they a professional? Do they own a home? Whatever the interest is. You can still have questions linked to the individual, but not ones that reveal the identity of the individual. And this has been perfected. Professor Khaled El Emam, who is at the University of Ottawa, is a brilliant individual in terms of the de-identification of data. He does this beautifully, and now he’s also expanding into the area of synthetic data, which is growing enormously, where you can have synthetic data that is comparable to the personally identifiable data, but minus any identifiers of any kind. And because it’s synthetic, it can be used for dozens of purposes and linkages that you might be interested in. So you don’t lose out on anything; you just lose the personal identifiers.
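The point about preserving associations is that anonymization removes who a record is about, not how its attributes relate to each other. A toy illustration follows (the columns are invented; formal approaches like El Emam’s also account for quasi-identifiers, as the risk sketch earlier suggests):

```python
import secrets

def tokenize_identifier(records: list, identifier: str) -> list:
    """Replace the direct identifier with a random token. Records from the
    same person stay linkable to each other, yet reveal nothing about identity."""
    tokens = {}
    out = []
    for rec in records:
        tokens.setdefault(rec[identifier], secrets.token_hex(8))
        anon = dict(rec)
        anon[identifier] = tokens[rec[identifier]]
        out.append(anon)
    return out

visits = [
    {"name": "Ann", "profession": "psychologist", "homeowner": True},
    {"name": "Ann", "profession": "psychologist", "homeowner": True},
    {"name": "Bob", "profession": "engineer", "homeowner": False},
]

anon = tokenize_identifier(visits, "name")
# The profession <-> homeowner association survives; the identity does not,
# and Ann's two records still share one token for within-dataset analysis.
print(anon[0]["name"] == anon[1]["name"], anon[0]["profession"], anon[0]["homeowner"])
# -> True psychologist True
```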

How do you think the perspective on privacy differs between Canada, the U.S., and Europe? I know that, in Europe, it’s a human rights issue, and in the U.S. it’s more of a consumer issue. Where does Canada stand?

Ann: Probably in between. Perhaps a little closer to Europe. But, increasingly, in the United States, you’re getting more and more privacy-related laws that are amazing and that come to approximate Canada and the EU in terms of the variety of areas which they’re addressing. For example, facial recognition is terrible from a privacy perspective. There are so many false positives associated with facial recognition. Of course, it takes place without your consent, without your knowledge. Nothing. In the U.K., where it abounds: 81% error rate. A study from Detroit a few months ago: 96% error rate. I mean, why are we doing this? A number of states in the United States have put an outright ban on facial recognition, which is brilliant and wonderful. We haven’t done that in Canada yet. So there are areas where, in the United States, they are advancing perhaps ahead of us. And they are trying to come a little closer to adopting a more European model, so stay tuned.

Patricia: Thank you so much, Ann! Rohit says, “Thank you for the great discussion!” That ends our webinar. Always a pleasure. I know you’re super busy, so very grateful to you for making time for us.

Ann: Oh, Patricia, it’s always my pleasure working with you and with all of your team and getting this message out there. Thank you very much.

Patricia: Thank you so much.
