Nishant Bhajaria on Security, Privacy and Ethics

In this podcast, Shane Hastie, Lead Editor for Culture & Methods, spoke to Nishant Bhajaria about security, data privacy, ethics and privacy by design.

Key Takeaways

  • The tech industry has a responsibility to do the right thing
  • The confluence of great engineering abilities, massive quantities of data and unintended consequences has resulted in major security and privacy breaches
  • Engineers need to consider the potential unintended consequences of decisions made today and how they will impact customers in the future 
  • Before capturing and storing a piece of data, seriously question whether you really need it and what it will be used for
  • Privacy engineering is about proactively preserving and protecting private information

Transcript

00:21 Introductions

00:21 Shane: Good day folks. This is Shane Hastie for the InfoQ Engineering Culture podcast. I'm sitting down with Nishant Bhajaria. Nishant is a speaker on the Ethics, Regulation, Risk and Compliance track at the upcoming QCon San Francisco conference. Nishant, welcome, thanks for taking the time to talk to us today.

00:40 Nishant: Thank you.

00:41 Shane: You and I have just met, but would you mind giving us a brief introduction, please?

00:45 Nishant: Absolutely, and thank you for having me here. My name is Nishant Bhajaria and I have been in the privacy and security space for about a decade now. Just to take a step back a little bit: after my graduate school years, I was an engineer at Intel, and I did a lot of coding in my WebMD days, the webMD.com business out of the US, but since then I've gradually moved into the privacy and security space. I have started and run programs to make sure that, when it comes to privacy and security, user data is safe, that we do it in a way that enables the business and builds trust, and that people have a better sense of exactly what it means to keep people safe. We in the tech industry have a lot of influence on people's lives. We disrupt people in a lot of ways that we don't always appreciate. And ever since the tech revolution really took off, and I mean the second one, post the dotcom revolution, it has been very, very data intense: more and more people coming onto the internet, more connectivity, more mobile devices, more social media platforms. I believe the responsibility is very, very high for us to do the right thing, and for people within the company to be a voice for the customer. So I have been in this field now for about 10 years, basically doing that. A little bit about me outside of work: I care very deeply about animal welfare and about the environment, so those are causes that matter to me. At work and outside, I think that our cause is bigger than the most recent earnings report or the most recent product release, and I try to bring that same sensibility to my work in privacy and security. So that's a brief introduction.

02:12 Consumers expect their data to be secure and private and the tech industry has not met those expectations

02:12 Shane: So let's explore that. You've been involved in privacy and security for a decade or more, but it's probably only in the last four or five years that privacy in particular has been a topic people have really spoken about. There was, I guess, from the consumer perspective, just the assumption that of course my data is going to be private, but then we learned otherwise. What caused that?

02:37 Nishant: Well, it's interesting. For those of you based in the US, you would know Justice Potter Stewart, a famous justice of the Supreme Court. He was once asked to define pornography, and he said, "I know it when I see it". Privacy, you don't know what it is until you lose it. And what started happening was, as I mentioned in my introduction, you had this confluence where a ton of data started getting collected by the tech industry and by the government. Let's not forget the government, they have a lot of data as well. And there was a moment when the fact that you can now build a lot of amazing products, combined with a bunch of engineers having unfettered access to information, was going to create some turbulence. What ended up happening was people started making products that had unforeseen outcomes, like the Strava case, where you had a fitness app reveal the location of US soldiers and military bases. You had instances where engineers had access to information that was extremely personal, extremely intimate. In some cases, people didn't even know that data was being collected. So all of these things came to a head, and there were several high profile incidents: security breaches, people behaving badly, and I'm not going to let the tech industry off, because I'm a part of it. All of that started happening, and a bunch of incidents happened in really quick succession. For me, it's not anything new. I saw it coming a while ago, and every time I build a team, every time I talk to engineers or even the executive suite, I tell them that decisions we make today will have good or bad outcomes six, nine, 10 months out. We can either do the right thing today and pay 10 cents for it, or we can do the wrong thing now and pay a hundred bucks for it, in reputational damage, in the cost of doing the right thing and cleaning up data. That was kind of a long way of saying that people have always assumed, subconsciously, that their data was going to be private, but they also felt that they were making a fair deal: that they were giving some data and they were getting good products.

04:24 We need to actively do the right things in order to rebuild the lost trust

04:24 And I feel that somewhere in the last three or four years, it's become obvious that that contract, that trust, has been broken. So I think we have a long way to go to build the right programs, show some discipline, and really demonstrate to the customers that we hear them and that we do the right thing. The course I taught on LinkedIn and the talk I'm giving at QCon demonstrate how you do the right engineering, provide the right alignment and take the right steps to make sure that data is private; then we're going to have a safer internet, people will feel better, and we will have a more engaged, more enlightened exchange of ideas.

04:56 Shane: The other thing that has been quite high profile is that legislation is beginning to catch up, but that's bringing with it its own areas of confusion. I live in New Zealand and we have pretty strong privacy laws, but we learned with the adoption of GDPR in Europe that there are some distinctions between our privacy laws and the European privacy laws, and now in the US different states are bringing in different bits of legislation, and they all have their subtle nuances. How does a software engineer who's trying to come up with policies and actually build something make sense of the legal frameworks?

05:39 Legislation is a base you build on top of, not the bar you aspire to

05:39 Nishant: Yeah, and you know, I'm not an attorney either, so it's a great question. The way I look at this is that when you have too much time without legislation, you then have a time with too much legislation. We've had an instance where there was a vacuum, and now there's too much rushing in to fill that vacuum. GDPR was interesting; let me tell you a really funny story about it. When I was at Google, I used the Google gym at work, and there was this really interesting facility where I could sign in with my badge at the beginning and then go to any device and it would load up my profile. All I had to do was enter a small code and all my preloaded workouts would just be there; I didn't have to sign in with my username and my password. It was all preconfigured. After GDPR, I had to sign in at the front with my badge, and I also had to sign in, with my sweaty trembling hands in the morning, with my username and my password. And I was told it was because of GDPR. I have been working in privacy for a long time, and I don't know exactly where it says in GDPR that my sign-in cannot carry over from one device to the other in the same gym. There is a lot of confusion, and there are a lot of different interpretations that run into this. My advice to people would be that we need to get our internal house in order first, without worrying about the inconsistency between the laws. Most companies, the big ones, and I'll get to this in a second, have really good legal departments. It's their job to abstract out exactly what applies from GDPR or CCPA or the laws being discussed in New York, Washington state and multiple other states. So let them worry about that.

Collect less data, and be sure you know why you are collecting it and how it will be used

I think engineering shouldn't wait for legal in this case. There are several things you can do correctly right at the outset. Collect less data; ask yourself whether you really need all this data, right from the beginning. One of the benefits of cloud computing has been that you can store data pretty easily: you don't have to have an onsite IT department, you don't have to have all that overhead. One of the benefits of social media, ubiquitous computing and fast internet has been that a lot of data is available and you have ready-made customers good to go. But the downside of all that is that there was a big breakdown in discipline. People got careless. So what we now need to do is ask ourselves: do we really need this data? Should we be copying it in 50 different places? Should we be giving everybody access to everything? Should we be making copies of stuff just so people can easily access it? If we need to fix an issue, should we be sending each other information about customers via email or via chat? All of that stuff creates a paper trail. It creates bad habits, it creates indiscipline, and then you are one bad employee away from somebody accessing the data incorrectly and all hell breaking loose. And there is not a law on the books that anybody can write that is going to save you from that. So my recommendation to engineers would be: be disciplined; know exactly what you have, where you have it, what you're doing with it, and who you are giving access to. If you're sharing data with a third party, for example, vet that third party; work with somebody within your company to understand exactly who this vendor is and what their terms of use are.
What are they going to do with the data? Remember, a lot of the bad stuff that has happened in the last three or four years, the big high-profile stories that we know about, was all okay, by the book. When Target was breached, they were PCI compliant. So you can follow every single law in letter and in spirit and still end up in a bad place. Ask yourself: would you like somebody to do to your data what you're about to do to somebody else's? If you follow that sensibility at the outset, you are already off to a good start. And from that point forward, it's a partnership with your legal team, with the PR team, with senior architects, making sure that there is a vetting process. You wouldn't push a code change out to production on a Friday afternoon at four o'clock, would you, because you might bring the whole thing down over the weekend? So why would you put somebody else's data on a test server that everybody has access to? I think some of that discipline is important. The reason I'm going on about this so much, Shane, is that when people complain about GDPR and CCPA and all the inconsistencies, it often becomes a crutch to not do the right thing. My sense is there are enough gaps for us to fill in the tech industry without criticizing the legal frameworks that are coming in. And there's a lot to improve on the GDPR side, don't get me wrong, I'm not letting them off the hook either, but frankly, I think there are a lot of things we need to clean house on first; then we can be a more proactive, positive contributor to the legal landscape, and we can have a good conversation from that point forward.
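
To make that discipline concrete, here is a minimal sketch, in Python, of the kind of data-inventory check Nishant is describing. Every name in it is hypothetical, invented purely for illustration; it is not any company's actual tooling.

# Hypothetical sketch: every stored field must declare why it is collected
# and who may access it, so questionable collection gets flagged early.
INVENTORY = [
    {"field": "email", "purpose": "account recovery", "access": ["auth-team"]},
    {"field": "precise_location", "purpose": None, "access": ["everyone"]},
]

def review(inventory):
    # Flag any field with no documented purpose, or with blanket access.
    return [entry["field"] for entry in inventory
            if not entry["purpose"] or "everyone" in entry["access"]]

print(review(INVENTORY))  # ['precise_location']: question it before collecting it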

09:51 Shane: You mentioned some reasonably straightforward-sounding things from an engineering perspective, but what does privacy engineering look like to the tech team sitting in front of the keyboard?

10:06 Explaining privacy engineering – proactively preserving and protecting privacy

10:06 Nishant: Let me give you a sense of how the tech industry really works. I always love reading the Tom Friedman columns in the New York Times. We've gone from Mr. Friedman in 2005 talking about how the world is flat, how the tech industry is perfect, no flaws, creating untold wealth (that one is pretty famous), to there suddenly being a tech lash where, allegedly, we can't do anything right. There's always an excess of admiration or condemnation. The way the tech industry does business, and it does it so well, is extremely bottom up, at least in theory, and very data driven: you have disconnected silos that can innovate individually, where I can do my thing, roll out my product and really run with an idea without you stopping me, and you can do likewise, and then let the best product win in this great Darwinian race to the top. That's kind of the ethos of the tech industry. It's very, very decentralized, very democratized. What privacy engineering is attempting to do is say: okay, democracy is great, but even democracy is not fully democratic. Even democracy in most of the bigger countries in the world is representative. In New Zealand you have a parliament, as they do in Canada and the UK, although the UK is an interesting example at the moment. In the US we have a federal system where you have the Senate and the House, and they don't get elected at the same time. So there is a filter: before people have complete control, their elected representatives weigh in. Privacy engineering serves as that elected representative in the middle, that cooling saucer that makes sure that everybody has at least a common layer of privacy discipline. As an example, let's assume you work for a major retailer, and you have a website that sells kids' supplies, another website that sells pet food, and another website that sells furniture, all part of the same empire. Any number of websites can fit that general description. They might all have different products, different customer bases and different modes of payment, but if they all function in the same market, they have to have the same sort of privacy controls. For example, if you sign into website A, website B and website C, you might expect that if you say No to one website, in terms of using your email for marketing purposes, that No should apply across the board. If you say no to my sending you promotional materials for pet food, you have reason to expect that that No should carry over across the board, or if it doesn't, you need to know that. So there needs to be a centralized way to build, sort of, what I call a consent service, which records exactly what Shane has said Yes to for website A, website B and website C. You can say Yes to everything, you can say No to everything, you can say Yes to one and No to the other two, et cetera. That needs to be centralized. So what a privacy engineering team would do is build a consent service that each of those three websites can plug into and have their Yeses and Nos recorded accordingly, so that there is one central place that decides whether you get a marketing email or not. That's one example of privacy engineering.
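
As a concrete illustration, here is a minimal sketch in Python of such a centralized consent service. The names (ConsentService, record_choice, may_email) are hypothetical, invented for this example; a real service would sit behind an API that every website calls.

# Hypothetical sketch: one central place records every marketing-consent
# choice, per user and per site, and every site must ask it before emailing.
class ConsentService:
    def __init__(self):
        # Maps (user_id, site) -> True/False: opted in to marketing email?
        self._choices = {}

    def record_choice(self, user_id, site, opted_in):
        # Record a Yes or No for this user on this site.
        self._choices[(user_id, site)] = opted_in

    def may_email(self, user_id, site):
        # An unrecorded choice defaults to No.
        return self._choices.get((user_id, site), False)

# Shane says Yes to the pet food site and No to the furniture site.
consent = ConsentService()
consent.record_choice("shane", "petfood.example", True)
consent.record_choice("shane", "furniture.example", False)
print(consent.may_email("shane", "petfood.example"))  # True
print(consent.may_email("shane", "kids.example"))     # False: default is No
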
Another example of privacy engineering would be making sure that if we, as a company, have agreed to keep you anonymous, we actually do. Say we run a business that builds mapping software: you might see all these cars that go outside and take pictures for maps, so you can know exactly what the different streets of the city look like. Well, a privacy engineering team would come along and build software that greys out people's windows at the front of the house, so if there was a human being standing outside, their face gets greyed out. We could build software that greys out license plate numbers, so you don't know exactly what car is parked where. You can also grey out the front porch, so if there's a kid playing outside, you can grey out their face. A privacy engineering team will build out privacy-preserving technologies and tools that the core product team can use, so the core product team can build things that make the company money, innovate quickly, be fast, agile, creative and disruptive, but have a sort of backstop, a safety check at the back, that preserves people's privacy. These could be techniques to delete data in time, to make sure that marketing consent is respected, to make sure that people have the right controls and the right disclosures available at all times, and that personal identities are concealed. All of that has to happen somewhat centrally, because this central team will have visibility into different businesses and different uses of data, and can make sure that everybody plays by the same rules. It's kind of a long answer, so let me know if I can clarify any of that.
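
One of those techniques, "delete data in time", can be illustrated with a small retention-policy sketch in Python. The names (RETENTION, purge_expired) and the retention periods are hypothetical, chosen only to show the shape of the idea.

from datetime import datetime, timedelta, timezone

# Hypothetical sketch: each record declares the purpose it was collected
# for, and each purpose has a maximum retention period.
RETENTION = {
    "fraud_check": timedelta(days=90),
    "marketing": timedelta(days=30),
}

def purge_expired(records, now=None):
    # Keep only records still inside the retention window for their purpose.
    now = now or datetime.now(timezone.utc)
    return [r for r in records
            if now - r["collected_at"] <= RETENTION[r["purpose"]]]

now = datetime.now(timezone.utc)
records = [
    {"purpose": "marketing", "collected_at": now - timedelta(days=45)},
    {"purpose": "fraud_check", "collected_at": now - timedelta(days=45)},
]
print(len(purge_expired(records)))  # 1: the stale marketing record is dropped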

14:14 The costs and benefits of privacy engineering

14:14 Shane: It sounds wonderful, but isn't it going to add an overhead cost and slow down the building of these great new products?

14:23 Nishant: Well, yes and no. First up, there is a way to do the right thing that doesn't slow you down. But before I get to that: my favorite computer science professor at college had a poster outside her door that said, "days of programming will save you hours of planning". It was a fantastic saying; I lived it then, I believe it now, and she was totally right. If you don't invest a little bit of work in the beginning to do privacy engineering right, you will have to invest a lot at the back end. I have a friend who works at a company that serves a lot of entertainment in real time to people on demand, and she found out that a bunch of her teams had been collecting data from their customers in the background without knowing it. There was no malice intended here. That data was being collected in the background and ended up on the company's servers, which then got copied to other servers, which then got plugged into an ad server and used by the AI algorithms to serve ads. As a result, people who didn't know they were giving data were now getting ads based on the data they didn't know they were giving. That resulted in a complaint, which resulted in a major audit, which resulted in a major cleanup operation for which they had to hire 17 new engineers. They had to take entire clusters offline. With the revenue they lost from all of that, they could have basically done three hours of work at the beginning of the project and saved themselves, I don't know, three quarters' worth of work at the back end. The question is not, does it take you more time; the question is, what are you willing to spend? Are you willing to spend a little bit at the beginning or a lot at the end? Those are your choices. But I would also push back against the notion that it's going to cost you time and slow you down at all, because I gave you the example earlier of the three different silos, three different websites that sell retail goods to customers. If you had three different teams building that consent service, you have duplicate code, and you have people potentially doing three different things, none of which might comport with the law. Instead, you could have one team building a unified consent service; that team has to talk to legal, has to talk to PR, and is on the hook to make sure the right thing happens, and if something is wrong, one team fixes it for everybody in one go. You will save a lot of time, and I have the data to back it up: I have run these centralized teams. And it always starts the same way, Shane. People initially push back, then they see you doing the right thing, and then they want to do more and more of the right thing, because they don't want to be on the hook for the privacy technologies at the back end. They know that you are doing the right thing, and all they have to do is plug into your tool and they are okay from a privacy perspective.

16:46 The costs of privacy engineering pale against the costs of privacy and security breaches

16:46 If your listeners take away nothing else, let them take away this: yes, it will cost you maybe 5% more at the beginning, but you will save a lot more time on the back end. So I would invest early and invest often, and if you do it right, you can go to the customer and say: hey, look, we did this correctly. There's a PricewaterhouseCoopers survey that I recently referenced in the LinkedIn course I taught, in which 80% of the customers said that if they believed their data was being protected, they would actually give you better data. You know all these online surveys where people just click through and give you bad information? Imagine if you actually got good information. You would have good data, and you could use that data to make better products, serve your customers better and make more money in the process. So it's a win-win if you do privacy right; don't let the initial cost bog you down.

17:31 Shane: So privacy is one thing. You mentioned your passion for causes that are bigger than the next earnings report. Can we expand this and talk about ethics in tech? Where are we at? Because there are a lot of high profile, rather disturbing things happening.

17:47 Ethics in tech

17:47 Nishant: Yeah, you know, it's interesting. I was talking to my wife in preparation for this conversation, and there are a lot of interesting things happening when it comes to technology. I remember when she and I went to Mumbai for the first time after we got married and, against my advice, she ate street food and promptly fell sick the next day. She told me it was totally worth it because the food was that good. I took her to the doctor, and the doctor said this was just a simple case of the food not agreeing with her, and it all went well. Then at the end, when it was time to prescribe medicine, the doctor basically said, okay, I'm going to give my assistant your prescription, and decided to scream the prescription loudly across the room, and basically read out all of my wife's vitals, her weight, her height, et cetera. The door was open, so there were like five people sitting outside. So now all of my wife's personal information was public, just like that. And you know, it's a different culture, it's a different expectation of privacy. All of this is by way of saying that a lot of the high profile stories we've heard basically amount to a cultural clash. We have assumed in the tech sector that we can do amazing things, we can build amazing products, and people will love those products. And you have to admire it: I don't know of any other industry in the last 20 years that has created this much wealth, this much access, this much opportunity, this much economic mobility. The last time the tech sector was really devastated by a recession was the one we actually caused in '99. Since then, even in the 2008-2009 crash, the tech industry mostly came out of it okay; it was the other industries that really got devastated. But because of that, the tech industry has believed that because our intentions are right, for the most part, building good products, trying to connect people, giving people access universally across the board, everything we do is automatically going to be correct, and there really shouldn't be a whole lot of process, there shouldn't be a lot of hindrances along the way. And I think we're kind of reaping the rewards, if you will, of that.

19:38 Ask yourself – how would you feel if somebody did to you what you are doing to your customer?

19:38 When it comes to ethics, it's absolutely important to ask yourself: how would we feel if somebody did that to us? How would we feel if somebody suddenly decided that major tech companies had a cap on profits, or couldn't make more than a certain amount of money, or that if we made a certain amount of money, a certain portion of that had to go to the people whose data we're collecting? If cavalier proposals like that really came to fruition without any real backlash or any real accountability, we wouldn't like it. When it comes to tech ethics, I think it's important to understand that we need to be more transparent, and we need to make sure that people understand what we are doing and why we are doing it. The more we have that conversation, the more we have it transparently, and the more we do it in a way that people understand what the value proposition is, the better; I think that's going to be really important. The other thing I would mention from a tech ethics perspective is that we shouldn't wait for the laws to do the right thing. I've always told my teams that we shouldn't use the law as an ending point; we should use it as a starting point. We should always say: this is what the law requires us to do, but let's assume that the law can never fully appreciate the complexity of technology. From a government perspective, it's impossible for anyone to fully understand it all. In fact, I would struggle to understand exactly everything that every product manager and every engineer is doing at any company I work for. These things move pretty quickly. So start with principles that go above and beyond the law, and really demonstrate that we're trying to do the right thing.

21:01 Codes of conduct for software engineering

21:01 Shane: Thinking of those principles: I know that organizations like the ACM and the IEEE Computer Society have codes of ethics and codes of conduct, but how many of us have actually ever read them? And how many consciously take them to heart?

21:15 Nishant: Yeah, so I do read those codes, by the way. In fact, I was part of the ACM when I was in college at Arizona State. I think it's one of those things where there is a need, in my opinion, for cross-functional collaboration at the industry level. Academia and industry have drifted apart a little too quickly and a little too far, in my opinion, to the point where a lot of these principles oftentimes don't appreciate how complex it is to build these products in real time. Because remember, the choice you make at a company is: if I don't do it quickly, my competitor will, and we're going to lose market share. Those are the impulses. And I think you can also make the argument that people who move quickly end up getting a lot of market dominance that they never give up, and the people who finish second never catch up. So there is a need for a level of cross-functional connection where doing the right thing gets you a level of visibility and a level of influence, because otherwise you're always going to reward the fastest player and not the most ethical one. That's number one. I think it's also important for industry to speak and partner with the ACM and the IEEE more often, and really give these principles credibility and say: hey, look, these principles actually mean something. In the last few years, I have tried to take every opportunity I can to speak to students. I've spoken a couple of times at the Carnegie Mellon Privacy Engineering Conference, and again and again I get asked by the students: what would you do if your employer asked you to do something that is completely unethical or illegal? It's really amazing. When I was a student, all I wanted to do back in Missouri was get a great job and make sure I graduated without any debt. When I drove out of Kirksville, Missouri, with no debt, a car fully paid off and $5,000 in the bank, I felt like I was the king of the world. And now these students are actually thinking about ethics; they're thinking about how to build a career that is meaningful. And by the way, these students are in debt as well, a hundred thousand dollars plus of debt. So we are in a bit of a unique moment where there is an opportunity to do the right thing, and I think it's important for the IEEE and the ACM, et cetera, to really gain a foothold into these businesses and say: hey, look, you are in trouble, there's a tech lash coming; let us give you these principles, let us work with you in a way that you can build products, and let us give you our imprimatur and say these products are actually ACM-safe or IEEE-safe. And let industry come back and say: hey, we will fund research, and we will keep our hands off and won't try to influence the outcome of this research; help us evaluate products that have been built or things that have been rolled out and the impact they've had on society, and let's use that combined partnership as a way to build these things out better. And I recognize that some of these proposals may not sound very popular, either to the ACLU or the ACM of the world, or to the industry partners that pay my bills. But the fact of the matter is, if we don't do the right thing now, we're going to have 50 different state laws, we're going to have a GDPR 2.0, we're going to have audits that are extremely invasive, and we're going to have a population that doesn't trust us.
I mean, if you have Steve Bannon and Bernie Sanders both saying that the tech industry is getting powerful, these two don't agree on anything, but they agree on the fact that the tech industry sucks. We've done the impossible: we have unified the far right and the far left. We do the right thing now, or somebody else is going to do the wrong thing to us. That would be my recommendation. Let's not be scared of ethics; let's lean in and define the ethics.

24:28 Continue the conversation

24:28 Shane: Nishant, thanks very much for that really interesting conversation. If people want to continue the discussion, where do they find you?

24:35 Nishant: I have a pretty active LinkedIn profile. I haven't published there in a few weeks because I just published a course on LinkedIn about setting up a privacy program. I also have a pretty new Twitter account. My vision, if you will, is to really be a spokesperson for setting up a privacy aware culture, a culture that is more responsive to the environment we live in. If the world resembled my ethics, I would make sure that it respects people's privacy and respects fair play: don't cut in line, don't use somebody else's data without their permission, don't be cruel to animals, don't destroy our environment, because we borrow it from the kids who have yet to come and occupy it. Those are the sensibilities I'm pushing for, and I would love to collaborate with people to help make that happen, whether it's speaking to people, setting up communities, hiring people or being hired by people. Connect with me on LinkedIn, connect with me on Twitter; I'm sure my email address will be available as part of the QCon endeavor. And I'm always looking for people to hire as well, by the way, so I would be remiss if I didn't at least put in a small plug for hiring, whether in my current job at Uber or elsewhere. I want to hire people who are ethical, honest and intelligent, because I think good products come from good people; I've never had somebody unethical or immoral build good products. So connect with me in all of those ways. I would love to keep the conversation going.

25:56 Shane: Thank you very much, indeed.

25:58 Nishant: Thank you.
