
Nishant Bhajaria on Privacy by Design


In this podcast, Shane Hastie, Lead Editor for Culture & Methods, spoke to Nishant Bhajaria, Head of Privacy Architecture and Strategy at Uber, about the need for privacy by design, the hard decisions that need to be made about privacy, and the factors which need to be considered when making trade-offs. His new book provides concrete advice on where to start and how to apply good privacy practices. You can find out more and purchase Nishant's book "Privacy by Design" here.

Key Takeaways

  • While privacy tools are improving there is a lot of confusion about how and where to start implementing better privacy protection in our products
  • Data privacy has to encompass what data to collect, when and how to collect it, who has access to what aspects of that data, in what format, for how long, how to store it, who to share it with, how much to share, how to delete it and many other factors which involve difficult and complex trade-offs
  • The earlier in the data lifecycle that privacy factors are applied the easier it is to ensure compliance and protection later 
  • The gap between the sentiment and the action regarding privacy is really wide – organisations want to do the right thing but are often unsure of how to do so
  • The best way to protect data is to not have it

Transcript

Shane Hastie: Hello, folks. Before we get into today's podcast, I wanted to share with you the details of our upcoming QCon Plus virtual event, taking place this May 17 to 28. QCon Plus focuses on emerging software trends and practices from the world's most innovative software professionals. All 16 tracks are curated by domain experts to help you focus on the topics that matter right now in software development. Tracks include leading full-cycle engineering teams, modern data pipelines, and continuous delivery workflows and platforms. You'll learn new ideas and insights from over 80 software practitioners at innovator and early adopter companies. Spaced over two weeks at a few hours per day, experience technical talks, real-time interactive sessions, asynchronous learning, and optional workshops to help you validate your software roadmap. If you're a senior software engineer, architect, or team lead, and want to take your technical learning and personal development to a whole new level this year, join us at QCon Plus this May 17 to 28. Visit qcon.plus for more information.

Shane Hastie: Good day, folks. This is Shane Hastie for the InfoQ Engineering Culture podcast. I'm sitting down with Nishant Bhajaria. Nishant is the head of privacy engineering and architecture for Uber. Nishant, welcome back, in fact. It's nearly a year since we last caught up. What's been happening?

Nishant Bhajaria: Well, besides the obvious, the pandemic, I've been working on more courses on LinkedIn Learning, where I teach people about security, privacy, how to build careers, interview for jobs, and also build teams that are inclusive. So, there's a bunch of coursework happening on that front and it's been even more involved because now, I can record those at home. So, that's been keeping me busy. There's obviously my day job, which I enjoy a lot. And then, I'm writing a book on privacy, which is a book aimed at engineers, program managers, executives, people in media, anybody who wants to understand how privacy works. How do you protect data better? How do you move beyond just the first-principles, ethics aspects of the conversation and actually instrument and build some of the tooling behind the scenes? Writing, teaching, doing your day job keeps you pretty busy. And then, of course, there's time for dog walks and just hoping to get that vaccine, which, hopefully, I will have had by the time this makes it to the internet.

Shane Hastie: Indeed, it's been a crazy time all around the world. We've just recently published the InfoQ Culture & Methods trends report, and the shift to remote, of course, was the biggest impact on organizational culture. One of the topics we did touch on in that trends report was, in fact, what's happening in terms of ethics and engineering. So, maybe we can delve first into that, privacy and ethics. What's the state of play and are we getting better? Are we getting worse?

Nishant Bhajaria: Well, it's interesting. We are trying to find a number where the numerator is changing because people are getting better. People are building more tools. There's more of an industry coming up. As you know, there's a lot of VC money flowing towards privacy tooling. There's a lot of people getting into the field. There's a lot of regulations coming in and hopefully, getting better on that front, as well. But it's also getting more complex because people's business models are changing. There's more confusion about changes in the industry. People's work habits are changing because of COVID. There's healthcare data moving around, as well.

Nishant Bhajaria: Because of the improvements being cancelled out by the changes, I think we're largely in a state of homeostasis where the more things change, the more they remain the same. What I'm hearing from the industry across the board is we don't know where to start. We don't know how to measure success. We don't know how to train people better. We don't know how to make sure that we're moving in the right direction. How do we measure? If you cannot measure improvement, you cannot be certain that improvement is taking place. So, it's a two-part answer here. One, things are getting better, but second, it's hard to ascertain because so much is also changing in the industry right now.

Shane Hastie: So, if we were to start, and try, and figure out a way to measure, where would we start? What would be the numerator for that baseline?

Nishant Bhajaria: I think the numerator for that baseline would be ensuring that there is some level of organizational consistency and tooling. Shane, when you and I talked last time, we agreed on the fundamental principle that executive sponsorship needs to be geared towards ethics and culture to make sure that we do the right thing in the company. Yes, you agree upon that. Everybody goes home. They get their t-shirts and they tell each other how wonderfully ethical they are. But then, there comes a time to make the right choices. How do you make sure that the tools get built? Tools to delete data. Tools to detect when data should not be collected. Tools to ensure that you don't share data with unscrupulous actors. Tools to make sure that you understand who has access to what in the company. And that's where the choices have to be made because people now have to make some trade-offs.

Nishant Bhajaria: You have gotten accustomed to engineers having access to everything. You have the ability to monetize data without any restraint. And now, you're going to have to be taking some of that stuff away from engineers and the business leaders. How do you build the tooling for that? Where does the resourcing come from? Is it a centralized team? When that team builds the tooling, how do you make sure that people adopt it? It's a bit like putting a sticker on your fridge saying, "There are desserts inside here. Do not eat them." Who puts up the sticker? Who makes sure the person living in the house obeys it? I feel like that's where the confusion is. And that's what motivated me to write this book that I'm talking about here, because I feel it gives people the tooling to get started and helps people understand the trade-offs: what happens if you don't build the tooling right, and what happens if you don't get started early on. That's kind of what I'm picking up across the board from the industry.

Shane Hastie: If I am a leader in an organization and I want to improve our privacy practices and approaches, you said people don't know where to start. Where would you start?

Nishant Bhajaria: Well, I would start early in the ingestion pipeline. One of my favorite diagrams, which I bring up whenever the topic of security, governance, or privacy comes up, is a funnel placed horizontally, with the narrow end on the left-hand side and the broad end on the right-hand side. As data enters the funnel, more people access it. People infer things from the data. They join that data to other data in the company. They connect it to other data they had previously. They build ML models based on it. Because of that, the TL;DR is, and I know I'm kind of repeating myself a little bit, as you move left to right in the funnel, the data size grows. So, the longer you delay any sort of tooling, the more data you're dealing with. You might be dealing with duplicative data.

Nishant Bhajaria: You might be dealing with data that is redundant, that is not high quality. So, you are now paying more to store data you may not need, and data which, if used, gives you the wrong answers. But somehow, this phenomenon of data envy, where you feel like you should have all this data because it might be useful down the road, makes your job much harder downstream. Because your ability to delete that data, obfuscate it, protect it, encrypt it, manage access to it. All of that is reduced because you are deluged by the volume of data. So, I would start as early in the process as possible.

Nishant Bhajaria: I would start at the point of ingest and say, "Here's what's entered the system. We have an understanding of what it is, what it means, what to do with it", and make several smaller decisions early in the process, rather than making that one big decision downstream. Because when we delay that one big decision, we keep delaying that one big decision. And the delays get longer, the decision gets bigger, the stakes get higher, and we end up not making a decision until somebody makes that decision for us, either by passing a law that is not easy to comply with or through a breach that is hard to contain. I would recommend putting in place governance tooling as early in the pipeline as possible.
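
To make the idea of "several smaller decisions at ingest" concrete, here is a minimal sketch in Python. The field names, tag format, and retention windows are illustrative assumptions, not anything Nishant prescribes; the point is only that classification and retention are decided at the narrow end of the funnel, before the data fans out.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Assumed taxonomy of sensitive fields; a real system would use a richer catalog.
SENSITIVE_FIELDS = {"email", "home_address", "ip_address"}

@dataclass
class IngestRecord:
    payload: dict
    ingested_at: datetime = field(default_factory=datetime.utcnow)
    tags: set = field(default_factory=set)

def classify_at_ingest(record: IngestRecord) -> IngestRecord:
    """Tag sensitive fields and attach a retention deadline at ingest time."""
    for name in record.payload:
        if name in SENSITIVE_FIELDS:
            record.tags.add(f"pii:{name}")
    # Shorter retention for anything carrying PII, a generous default otherwise.
    days = 30 if any(t.startswith("pii:") for t in record.tags) else 365
    expiry = (record.ingested_at + timedelta(days=days)).date()
    record.tags.add(f"retain_until:{expiry}")
    return record

record = classify_at_ingest(IngestRecord({"email": "a@b.com", "cart": ["dog leash"]}))
print(record.tags)  # e.g. {'pii:email', 'retain_until:2021-04-04'}
```

Every downstream consumer then inherits tags it can act on, instead of facing one large, undifferentiated decision later.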

Shane Hastie: How do we balance the need to generate insight and provide customer service, provide better products, with these privacy concerns?

Nishant Bhajaria: I would say... Let me try and rephrase the question a little bit and you can set me straight if I didn't understand it right. Are we suggesting that these privacy concerns are not fixable? Are we saying that it's endemic to how the products get built? I just want to make sure I understand your question correctly.

Shane Hastie: I picture myself in that leadership position, I want the benefit of being able to see all of that data, get the insights. But I also want to do what is right by my customers from a privacy perspective. So, it feels to me like there's a trade-off, a difficult balance here.

Nishant Bhajaria: There is, and that's kind of what I was getting at. When you look at it through that lens, it almost feels like every product is going to have some privacy concerns. And the challenge, Shane, is the shadow IT world, where engineering development has become so decentralized, and because of that, desensitized. The volume of data is so high, and the data is so unstructured, that you don't actually know what is in those blobs of data until the right person accesses it. That means it is very hard to know exactly what you have and why you have it. It's a bit like having to look inside to know what's inside: by the time it's already in the house, you don't know exactly how to get it out the door. So, my recommendation tends to be build early detection models. Ask people who needs this data. Understand, from a lineage perspective, how this data ended up flowing downstream.

Nishant Bhajaria: Here's where you can benefit from the legacy that you have built up in the company. You have all of this past history of data, your tech debt, the old data that has accumulated. Understand how it got into the system. Who collected it? Who used it? How long was it kept? Performing that analysis is a bit like going to the doctor and getting an MRI done, or an annual physical done. It won't cure you right away, but at least you'll know which numbers are off. And then, you know what to cut back on. And gradually, the old debt will also age out, or you will learn how to delete it better, but you need to understand what you have been doing wrong. It's a bit like, and I hate to use the analogy of alcoholism, but somebody with a psychology degree told me something a while ago.

Nishant Bhajaria: They said that one reason people who are alcoholics are afraid to give up their addiction is because there's a moment when you stop drinking and you are sober, and then you realize what you look like and all the damage you've done, and you are afraid of that moment of reconciliation and facing up to your mistakes, or your weaknesses, or whatever, call it what you will. And that fear is what keeps people from stopping. And they keep going on that destructive path. Data collection is sort of like consuming alcohol. You are intoxicated by this volume of data because it tastes so good and the money is so great. And I feel like that's what stops people from doing the right thing, because they're afraid of what that pause is going to mean. "Is the other guy going to stop as well, or is it just us? Or will it break all the systems that depend upon this unmitigated flow of data?"

Nishant Bhajaria: My advice is intercept data early on. Identify what it is. Tag it so you know exactly what risk it entails and how it maps to, say, a law like GDPR, or CCPA, or CPRA, or what have you. And embed into the data decisions about what it means to have that data. So, who should access it, and for how long? Could you share this data with somebody else? As an example, if you are collecting data about people's shopping behaviors to serve them an ad, it may make sense to collect some data in a way that surfaces an ad right away, because you want them to make a purchase right away. It makes sense. That's how the internet works. But if you are sharing that data with an outside vendor, to serve them an ad the next day, you might want to anonymize that data so that, maybe, you key off some other ID, which can be used to serve that ad in specific environments where they have agreed to be served ads based on data that was collected from them before.

Nishant Bhajaria: There are these trade-offs: how you collect data, what you use it for, when you use it, how identifiable that data is. You can only make these sorts of decisions if you intercept that data early on. Again, I'm going to bring back my horizontal funnel. If you make this decision further down the funnel, at which point the volume of data has grown, your ability to detect data has shrunk. And therefore, the error rate has gone up significantly, and that's where laziness takes over. "Let it go. We'll do it later on." And that data then remains in your system. It gets copied to 50 other places. It gets shared accidentally with some other vendor, and then they suffer a breach. And then, you end up getting fined. Your customers stop trusting you. And then, everybody wakes up downstream saying, "Oh, my goodness. How did this happen?" Well, the mistake was made several steps before, and it's almost impossible to reverse engineer and fix it after the fact. I hope that makes sense.
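
As a rough illustration of the trade-off Nishant describes, here is a sketch of purpose-based release: the same shopping event is handed out differently depending on who is asking and why. The purpose names and field lists are assumptions for illustration, not a real API.

```python
# Each purpose sees only the fields its policy allows; sharing with an
# outside vendor gets a pseudonym instead of the real user ID.
ALLOWED_FIELDS = {
    "ads_realtime": {"user_id", "item_viewed", "timestamp"},  # first party, immediate
    "vendor_share": {"pseudonym", "item_viewed"},             # third party, next day
}

def release(purpose: str, event: dict) -> dict:
    fields = ALLOWED_FIELDS.get(purpose)
    if fields is None:
        raise PermissionError(f"no data-sharing policy for purpose {purpose!r}")
    return {k: v for k, v in event.items() if k in fields}

event = {"user_id": "u42", "pseudonym": "p9f3",
         "item_viewed": "dog coat", "timestamp": "2021-03-05T10:00:00Z"}
print(release("vendor_share", event))  # {'pseudonym': 'p9f3', 'item_viewed': 'dog coat'}
```

The crucial part is that a policy exists at all: if the event was not intercepted and tagged early, there is nothing for the release check to key off.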

Shane Hastie: We have so many sources of data today. There's everything we're carrying around with us. There are all of the apps, all of our devices, desktop, laptop. My smart light bulb knows things about me. There is this plethora, this lake, of data. Putting myself on the other end, how can I, as a consumer, start to feel comfortable that people are using my data well? Because I still want the benefit of being able to tell my smart light bulb to turn on when I walk into the room.

Nishant Bhajaria: Yeah. And that's where there is an interesting dichotomy, Shane. It's like Winston Churchill basically said, "Democracy is the worst form of government, except for all the rest." And technology is the same thing. We can dislike tech companies all we want to, but at the end of the day, there is a back and forth here. The data fuels the products. The products then produce more data, which then produces other products, and back and forth it goes. Where I think people need to be careful is, at some point, when you collect data... For example, if I'm collecting your data to make an appliance that's responsive to your needs at immediate notice, there is a definitive expectation that you provide me data that is tied to your needs, that is tied to a specific timeframe. What happens if that data then gets used for something completely different? If you consent to let your data be used for that product to be adaptable to your needs, what happens when that data then gets used to market other things to you?

Nishant Bhajaria: What happens when that data then gets sold to another vendor, that then builds a website or a quiz marketed to you, that then gets used to collect even more data? So, now you see how... Data is a bit like energy, it never completely disappears. It just keeps regenerating. How do you ensure that what is collected gets used for its intended purpose? How do you make sure that it is deleted on time? How do you make sure that there is trust on that front? Or in this case, I may collect your data that was mapped to your usage, if you will, but then I can aggregate the behaviors that you demonstrated and disconnect those behaviors from your ID. So, I will not know that it's Shane Hastie's data, but I will know that somebody in their thirties or forties who lives in the ANZAC region uses their home appliance at this time of the day and has specific things done.

Nishant Bhajaria: And that enables me to build behavioral profiles of users like you, and build better products and invest better, right? That's okay, theoretically speaking, as long as I'm protecting your privacy but also fueling my business. But those decisions are only possible if you understand exactly what the privacy implications are of the data usage. For example, I can disconnect your ID, but can I also disconnect your IP address? If you live in a big city, say Auckland, or Napier, or Dunedin, well, there are a ton of people who live there, so maybe I don't identify you. But what if you live in a small town like Elko, Nevada, where very few people live? There, identifying your data is as good as identifying you. So, there are trade-offs that have to be made based on location, as well. Those sorts of decisions need to be made, and it requires some level of early intervention.
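
The Auckland-versus-Elko point can be made concrete with a small sketch: keep fewer IPv4 octets where the population is sparse. The population threshold and octet counts here are illustrative assumptions, not a standard.

```python
def generalize_ipv4(ip: str, keep_octets: int) -> str:
    """Zero out the trailing octets of an IPv4 address."""
    octets = ip.split(".")
    return ".".join(octets[:keep_octets] + ["0"] * (4 - keep_octets))

def octets_to_keep(population: int) -> int:
    # Coarser generalization for small towns, where even a narrow
    # network prefix can be as good as a name.
    return 3 if population > 100_000 else 2

print(generalize_ipv4("203.0.113.77", octets_to_keep(1_600_000)))  # 203.0.113.0
print(generalize_ipv4("203.0.113.77", octets_to_keep(20_000)))     # 203.0.0.0
```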

Shane Hastie: Tell us a bit more about the book. What's in it? Why should I buy it?

Nishant Bhajaria: I think you should buy it because it teaches you how to build privacy into your company. It's one of those things where the longest leap, at this point, is between what gets discussed about privacy in the media, and actually doing it right within your company, in a way that is responsible, scalable, transparent, and credible. The gap between the sentiment and the action is really wide. And it is not helped by us moving so quickly from, "Oh, my goodness. Who cares about privacy?", which was four or five years ago, to it being in the press every single day. Company after company, politician after politician wants to talk about this. But sometimes, for those of us that have been in this field for a while, it feels like a bit of a feeding frenzy, where I'm not sure how many people actually know what it takes to build some of these tools, or the trade-offs that need to get made, or how to get executive buy-in. There's a rhythm to it.

Nishant Bhajaria: There's a strategy to it where you build a little bit, you show outcomes, you demonstrate that the business can still function. You can protect customers and actually have a value proposition and also do the right thing for privacy. And then, you use that win to get more buy-in, and you use that buy-in to build more products. There's a sequence to it, but it takes time for those results to show up. So, this book will help you build tooling, will help you understand data governance, and will connect to all the news stories. For example, the New York Times had this pretty detailed report, back in late 2019, about how President Trump and his entourage were detected as they were traveling in Florida, where he played golf at Mar-a-Lago. He had a meal with the Japanese Prime Minister, and then he came back for a fundraiser. And the reason that happened is because somebody in his entourage had a smartphone with an app that was broadcasting location data.

Nishant Bhajaria: Journalists sitting at the New York Times headquarters were tracking the most protected human being on the planet because somebody in his entourage had a smartphone that was broadcasting location data. Fine, that's great information, but what does that mean in terms of tooling? What does that mean in terms of data deletion? What does that mean in terms of sound data governance? How do I review my products better before they get shipped out the door and start collecting a ton of data? How do you instrument those things? How do you hire people? How do you train people? How do you build the UI? How do you build the back-end databases? What are the trade-offs? Now, this book is interesting for me, and I'll always be an interested party because I wrote it, but it has enough learnings from my past where I tell people, "Here are the patterns you need to observe."

Nishant Bhajaria: It is not overly prescriptive, in that it doesn't tie itself to a specific set of technologies, but it's instructive enough that you can read it, understand what to do and what not to do, and apply it to your unique circumstances. What makes privacy challenging, Shane, is that everybody's tech stack is just a bit different. Everybody's business lines are just a bit different. So, how do you take a book and apply it to all of these different skill sets and different domains? Because remember, everybody builds products differently in the company. People have different release cycles, different code teams, et cetera.

Nishant Bhajaria: They don't often talk to each other. For privacy, we're going to have to stitch these alliances together. So, this book will help provide that context. One last example I'll give: the countries that are great, I believe, are democracies, because democracies have hardware and software. They have hardware, that is, armies, good business systems, civic institutions. And they have good software, that is, communities, infrastructure, educational institutions, laws, and values that we respect. Privacy requires both hardware and software. Privacy requires tooling and organizational alignment. And this book provides both. It's like democratizing privacy in a way that is actionable.

Shane Hastie: What are the trends? We spoke right at the beginning about ethics, and you said, "We're getting better." But the challenges are getting greater, so we're sort of sitting in the middle. What are the trends that you're seeing that we need to consider when we look forward to the next 6, 12, 18 months?

Nishant Bhajaria: I think trends are interesting. I made a prediction a year ago today, actually, on the 5th of March 2020: I told my friend that COVID was going to be a two-day story, and that did not age very well. So, I'm a little humble when it comes to broadcasting the future. But I do know more about privacy and security, and software, and business than I do about medicine. So, maybe I should stick to my domain of expertise. When it comes to privacy, my prediction is we're going to see a great deal of splintering. My hope would be to have some sort of catch-all federal legislation in the US that provides clarity on exactly what privacy means. How do you apply it? How do you build it? Things like that. Instead, what we're probably going to see is several state-level laws, and different interpretations of those laws at, maybe, the local or federal level. We're going to see a lot of confusion and, as a result, there are going to be tons of vendors.

Nishant Bhajaria: We're already seeing vendors, some specializing in consent. Some specializing in data governance. Some specializing in data detection, and protection, and encryption. And we're going to see this patchwork model where you will have, on the one hand, more awareness, more tooling, more regulations, which gives this feeling of airport security: you feel that if all these folks are watching us, surely the bad folks won't get on the plane. On the flip side, it's going to be very hard to measure, because when you have so much siloing of the efforts, it's going to be very hard to judge how well the user is being protected.

Nishant Bhajaria: You're going to see, maybe, breaches or privacy harms become big news stories, which is going to be used as evidence that nothing is working. Or you're going to see a long period of no activity, where people are going to think they're okay, that all these laws are working because we're not seeing all these bad stories in the press anymore, but you might still be making mistakes. So, we're going to see a lot of false positives and false negatives. That's kind of my somewhat, not very optimistic narrative for the future.

Shane Hastie: How do we turn that around? How do we turn that into a more positive story?

Nishant Bhajaria: I think how we turn that into a more positive story is by ensuring that companies invest early in privacy. And I have made the case in the book that you can invest in privacy early and make a very affirmative use case to improve your relationship with the customer. When you delete data, you end up with less data, but you have more of the data you actually care about, which means fewer queries, fewer burn cycles, fewer copies of bad data. And as a result, you have a better end outcome. You spend less money storing data you don't need. One of my friends who does PR in privacy and security, she's very sharp, she tells me that the best way to protect data is to not have it. So, if you have data that's sitting in your warehouse or sitting in your Cassandra storage, expecting to be used later on, you are also spending money to protect that data.

Nishant Bhajaria: You're spending money managing encryption keys, access control algorithms to protect it, et cetera. If you delete that data sooner, or obfuscate it, you are not spending as much money protecting it. So, the case I'm making is, you lean in a little bit early, you spend the money early on, and as a result, the investment you make downstream ends up getting better. You have better products. You end up saving money downstream. You have fewer security risks. You have less of this worry that something bad will happen. So, I would say, don't wait for the government to tell you what to do. We all know the right way to do this. There are tooling examples in the book. There are other examples out there. Use them, make those investments early, and then use the regulations of this world as a floor, not a ceiling.
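
Here is a minimal sketch of the "best way to protect data is to not have it" argument, assuming each record carries a retain_until date (for example, the tag attached at ingest in the earlier sketch); the record shape is an assumption for illustration.

```python
from datetime import date
from typing import Optional

def sweep(records: list, today: Optional[date] = None) -> list:
    """Keep only records inside their retention window; everything dropped
    no longer needs encryption keys, access control, or storage spend."""
    today = today or date.today()
    return [r for r in records if date.fromisoformat(r["retain_until"]) >= today]

records = [{"id": 1, "retain_until": "2021-01-01"},
           {"id": 2, "retain_until": "2031-01-01"}]
print(sweep(records))  # only record 2 survives
```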

Shane Hastie: One of the things that you spoke about was shifting to the left, earlier on, before stuff comes into that inverted funnel of yours. One of the potential challenges I see there, or is it a challenge? How do our small autonomous two-pizza teams, working on things further to the right of the pipe, shift themselves to the left?

Nishant Bhajaria: I think honestly, for some teams, Shane, it's going to be very hard to shift further to the left. So, I would say we need to look at governance much more broadly. Governance, historically, is this catch-all term which basically seems to mean that we're going to do the right thing. We want to check these boxes. "Did you check this feature? I checked it. It's great." But my advice to companies is, let's come up with a way to catalog this data early on in the process. What that means is, let's assume you are a company that collects data about people's shopping habits. You basically have a user profile. You collect data about them, for example, their home address, and you use that data for two purposes. Again, we're going to be very simplistic here. You collect that data to ship products to them, but you also use that address to do some research.

Nishant Bhajaria: That is, what are people in the zip code doing? What kind of products are they buying? When do they buy them? How much do they click on ads? What operating system are they using on their smartphones? Things like that. Because that tells you a lot about purchasing patterns. Now, these teams downstream that you speak of could serve two purposes. They could serve a fulfillment purpose, in which case they are shipping stuff that people bought and paid for, and they need to have access to the full address, because you can't ship something to my house unless you know exactly where I live. Or they could be using my purchasing habits to make inferences about what I might like to buy in the future. As an example, if I bought a new leash or a new coat for my dog, you could infer that maybe I've got a new pet and maybe, two weeks down the road, I'll need pet food. Four weeks down the road, I might need pet supplies.

Nishant Bhajaria: Six weeks down the road, I might need medication refills. Those sorts of inferences you might make, not just about me, but for a whole bunch of other users. So, when these downstream teams get that data, you might want to anonymize my email address, obfuscate my home address as well, and maybe round off some decimal points from my IP address. So, you can draw a bigger circle and get more data, analyze more information, but not aim it at me. So, I would say apply tags to the data, say, "home_address_single_use_user_purchase". If I see that tag, I know that this data is very granular. It has a human being's specific home address. I should only use it to ship something. But if I'm doing analysis, maybe I should let this data go or delete it.

Nishant Bhajaria: On the other hand, you could come up with a separate tag, say, "home_address_obfuscated_three_decimal_points". What that means is, it does not have my home address; it only has my IP address, rounded to three decimal points, which means it probably covers four or five blocks. That means you can get better-quality analysis, by getting more data covering more use cases, without identifying a specific person. So, you have now achieved two goals. You have achieved better data protection for me, from a privacy perspective, and you are now looking at data much more holistically and getting better outcomes. But the only way these downstream teams will be able to make those decisions is if the people who collect the data upstream apply those tags to the data. So, you need better connections between these upstream and downstream teams.
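
Those two spoken tags translate naturally into a tag-driven access check, sketched below. The team names and the policy mapping are assumptions; only the tags come from Nishant's example.

```python
# Fulfillment may read the granular address; analytics only the obfuscated form.
POLICY = {
    "fulfillment": {"home_address_single_use_user_purchase"},
    "analytics": {"home_address_obfuscated_three_decimal_points"},
}

def can_read(team: str, tag: str) -> bool:
    return tag in POLICY.get(team, set())

assert can_read("fulfillment", "home_address_single_use_user_purchase")
assert not can_read("analytics", "home_address_single_use_user_purchase")
assert can_read("analytics", "home_address_obfuscated_three_decimal_points")
```

The check itself is trivial; the hard part, as Nishant says, is that upstream collectors have to apply the tags for downstream teams to have anything to enforce.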

Nishant Bhajaria: Rather than moving the downstream teams further to the left, I would say have them talk to the people who collect the data. The collectors need to talk to the consumers. The collectors no longer get to say, "I collected it. It's not my responsibility to protect it." That has historically been the position at a lot of companies. No more of that. And the consumers can no longer say, "We're not responsible for what gets collected. We're just going to use what we use." No, these folks need to talk to each other. And the people who collect the data need to know how it gets used. And the people who use the data need to know how it gets collected. That bridge is going to be very critical. I'm okay with people living on their islands, but let's have some bridges in between.

Shane Hastie: Those bridges are sometimes hard to build.

Nishant Bhajaria: Yeah, it costs money.

Shane Hastie: I'm going to go beyond money. There's a culture shift here, isn't there?

Nishant Bhajaria: It is. That's one of those things where the volume of data sometimes wipes away the significance of it. I was talking to a friend of mine the other day, and she was complaining about a specific politician who lies a lot. And she said that this guy lies so much, his lies almost don't matter. And there's sort of a truth to it. If you lie so much, people stop believing you and it almost doesn't matter anymore. So, what happens? The parallel I'm drawing here is that the data that is collected is so voluminous that people forget how important that data is for individual users, because there are so many users in this case. When you have privacy rights being violated for so many users, an individual user often is forgotten. My push, from a cultural perspective, is to make the people who collect the data just as invested as the people who use it.

Nishant Bhajaria: Because what happens is, if you end up with data being collected incorrectly, and somebody has to go and delete the data, all those changes have to be made pretty quickly. They have to be made pretty suddenly. And now everybody's impacted. The people who collected correctly, and the people who collected incorrectly. The people who use it wisely, and the people who use it unwisely. The people who manage the data platforms. The people who manage the microservices. The people who manage the Hive storage. The people who run the queries. The people who do the audits. I wouldn't want to be the engineer whose cavalier behavior towards data causes so much churn downstream. So, the culture change here is: I could do 50 correct things, but the one wrong thing I do could be hugely disruptive to the company's reputation and to the end user. So, why not do the right thing at the get-go?

Nishant Bhajaria: Because just as I wouldn't like it if somebody else disrupts me, I wouldn't want to be the one disrupting somebody else. So, the culture change is about doing the right thing in a way that doesn't just benefit you, it benefits everyone else. And I feel like this has to come from the top. It's like innovation: initiative always starts bottom-up, but culture has to start from the top down. If you see senior executives in the company, senior leadership, leaning in and saying, "Let's do the right thing from a privacy perspective," investing in it, and then making the hard choices and making sure that the wrong stuff doesn't ship out the door unimpeded, those choices send a message. And I think the cultural choice is important. And that's why I was leaning into the money aspect, because I usually try to make the argument with money first, because you make the money argument to get people's attention.

Nishant Bhajaria: And then, you make the cultural argument. In some cases I make the cultural argument first; it literally varies on a case-by-case basis. But for people like me, that have been in the privacy field for a long time, we're just accustomed to making the financial argument, because so much of our early career was spent not knowing where the cultural argument would go. We've had more success with the money argument, but my encouragement to people would be to lean in on both the money argument and the cultural argument, because you don't know exactly who the person on the other side of the table is and what's going to sell better.

Shane Hastie: You said the book's coming. It's called Privacy by Design. It's coming from Manning. Where can people find it, and where can they find you?

Nishant Bhajaria: The book's title is Privacy by Design, and it's on manning.com. If you go to manning.com and enter my name, or if you just Google my name, it's one of the earliest links that comes up. It's going to be released later on this fall. You can find me on LinkedIn. And just reach out to me on LinkedIn, I'd be happy to provide you a link. It's already doing pretty well on pre-sales because there's a significant level of interest among people who are struggling to do the right thing. What I'm seeing, Shane, is that there's a lot of interest among the engineering population, because these are folks who need to actually handle the data correctly from a privacy perspective. But there's also a lot of interest at the executive level, because they routinely get hit with these fines, these budget requests, these headcount requests. So, they need to understand why exactly they're spending all this money, and what the cost of not spending it would be.

Nishant Bhajaria: I'm also seeing a lot of uptake from the media industry. There's a lot of interest among the journalist community. People want to understand exactly what happens behind the scenes. What are the trade-offs from a privacy perspective? I'm seeing a lot of uptake there, as well. I would not want people to think of this book as being only for the executives, or only for the engineers. It's for literally anybody who wants to get started in this field. There's material in it at a fairly beginner level, material for people who are further along, and then material for people who are much more advanced. So, it's manning.com/privacybydesign. That's the link for the book.

Shane Hastie: Nishant, thanks ever so much. It's been good to catch up.

Nishant Bhajaria: Definitely. Thank you so much Shane, again, for having me.


More about our podcasts

You can keep up-to-date with the podcasts via our RSS feed, and they are available via SoundCloud, Apple Podcasts, Spotify, Overcast and Google Podcasts. From this page you also have access to our recorded show notes. They all have clickable links that will take you directly to that part of the audio.
