Anne Currie on Organizational Tech Ethics, Including Scale, GDPR, Algorithmic Transparency

On this podcast, Anne Currie joins the tech ethics discussion started on the Theo Schlossnagle podcast a few weeks ago. Wes Reisz and Anne discuss issues such as the implications (and responsibilities) of the massive scale we have at our fingertips today, the potential effects of GDPR (EU privacy legislation), how accessibility offers an example of how we could approach tech ethics in software, and much more.

If you want to find out what every software developer, data scientist, or ops engineer should know about GDPR, download our free guide "Perspectives on GDPR".

Key Takeaways

  • Ethics in software today is particularly important because of the scale we have available with cloud native architectures.
  • Accessibility offers a good model for how the discussion on tech ethics can evolve, with aspects that include both a carrot and a stick.
  • Bitcoin mining’s power consumption is an example of a negative consequence that was never anticipated.
  • The key to establishing what we all should and shouldn’t be doing with tech ethics is to start conversations and share our lessons with each other.

Show notes

What’s the focus of the work you’re doing today at Container Solutions?

  • 2:25 It leads on from the work I was doing at my previous startup, getting people into the cloud.
  • 2:40 A lot of the past year was spent talking to companies who are moving into the cloud, and asking why they did it, what they got out of it, and what went well and not so well.
  • 3:10 One of the things it led me to realise (which wasn’t obvious at the start) was that we need to look closely at ethics in tech right now.

Why ethics, and why today?

  • 3:25 It comes out of what I realised by talking to others about moving to the cloud.
  • 3:35 One reason was that they could get massive scale, and the other was they could move much more quickly.
  • 3:50 People could be experimental and deploy things 500x faster - people could go from an idea in their head to a scalable working product.
  • 4:05 It’s a major ethical issue, because it isn’t just one person doing this, it’s a large number of people.

It’s the impact?

  • 4:30 Yes, exactly - the German philosopher Immanuel Kant wrote about the importance of universality: what happens when a behaviour scales to everyone.
  • 4:50 For example, if I drop litter, it’s unfortunate. If 7 billion people drop litter, that’s a major issue.
  • 5:00 As things scale, they become different - and ethical concerns grow as well.
  • 5:15 It’s important that if you change people’s behaviour, it goes in a good direction and not a bad one.
  • 5:20 That means when you get things out there quickly, you may not think of the ethical concerns.
  • 5:30 Look at Facebook first, and ask: is its behaviour ethical or proper, particularly with fake news and the elections?
  • 5:45 Facebook has both enormous scale and until very recently had a motto “Move fast and break things”.
  • 6:00 They have been able to move extremely fast - and the ability and the scale means there are ethical implications.
  • 6:30 We know there is a level of bad behaviour we can tolerate at a niche scale, but at the scale of Facebook and Google it becomes a real risk.

Are there ways of reasoning about this?

  • 7:00 We are left to our own devices - this is new; we need to work out what tech ethics means.
  • 7:20 The Kantian approach is to ask whether, if everyone did this, the world would become a dystopia.
  • 7:30 One person’s dystopia might be another person’s utopia.
  • 7:40 We need to try and find out what works.
  • 7:50 We think Facebook has had impacts on democracy, but we’ll see worse things happening in future.
  • 8:00 We’ve never had the scale of the web or the speed of delivery before.

How do you define tech ethics?

  • 8:20 It’s about how we can take pragmatic steps to achieve good with technology.

What about SSL - it can be used for good and bad?

  • 8:50 It’s an interesting example of an ethically neutral technology.
  • 9:05 It can be put to a bad use or a good one, depending on what the user does with it.
  • 9:40 It’s still fundamentally the folk who use it who are immoral or not.
  • 9:50 If we start to impose morals on the platform, we’re looking in the wrong place.
  • 10:00 The internet has been based on intermediary liability protection: if you are just providing the service, you can’t be sued for its content.
  • 10:10 Facebook, on the other hand, has been making a lot of money by providing its service.
  • 10:15 It has been difficult to stand by that intermediary protection when they profit from the content like this.
  • 10:20 But the protection has been incredibly important to the growth of the internet.

What’s the prior art?

  • 11:10 A good thing to look at is accessibility - it’s an area of ethics in tech that has been worked on for decades.
  • 11:25 I have a visual disability, and being able to zoom in on websites means that I can use them.
  • 11:35 We now have standards from the likes of W3C, and a lot of carrot and stick from Google.
  • 11:45 We have known for a while that our Search Engine Optimisation (SEO) has depended upon us following these accessibility guidelines.
  • 11:50 We’ve had both standards, and clear financial benefits from doing so - and that’s had a really good effect.
  • 12:05 It’s a matter of following good rules up front, and careful thought from the early stages of a product.
  • 12:30 I increasingly believe that this is how tech ethics should work (a minimal sketch of the kind of automated accessibility check this enables follows this list).
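
To make the carrot-and-stick point concrete, here is a minimal sketch of the kind of automated check the W3C guidelines enable. The requirement that images carry alt text is a real WCAG guideline; the page fragment and the checker itself are hypothetical illustrations, using only the Python standard library.

    # Minimal sketch: flag <img> tags missing alt text, one of the best-known
    # W3C accessibility (WCAG) requirements. Standard library only.
    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.violations = []

        def handle_starttag(self, tag, attrs):
            # attrs arrives as a list of (name, value) pairs
            if tag == "img" and "alt" not in dict(attrs):
                self.violations.append(dict(attrs).get("src", "<unknown>"))

    # Hypothetical page fragment, for illustration only.
    page = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
    checker = AltTextChecker()
    checker.feed(page)
    for src in checker.violations:
        print(f"Missing alt text: {src}")  # -> Missing alt text: chart.png

Checks like this are the carrot half of the story: they are cheap to automate, and passing them has historically fed directly into search ranking.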

What is GDPR and your views?

  • 13:00 I’m torn about GDPR - a set of data-protection and privacy rules that come into effect in the EU in May this year.
  • 13:20 Theoretically, they’re good - rules about privacy, protection and the right to be forgotten, and that the algorithms applied to you are fair.
  • 13:40 But it’s all stick and no carrot.
  • 13:50 With accessibility, there was an openness to discuss the ideas and implementations - there wasn’t a need for Google to get in on it.
  • 14:00 I’m worried that GDPR is too big a stick too soon, and it might be discouraging discussion.
  • 14:20 The fines for infringing GDPR are millions of dollars, and right now we don’t know how aggressive the EU will be in applying those fines.
  • 14:30 So everyone is frightened - and when you’re frightened, you don’t share information.
  • 14:40 At this stage of ethics, we need to share information.
  • 15:00 In comparison, the accessibility evolution had no stick and a big carrot, so it was worthwhile for people to say what they were doing.
  • 15:30 For ethics, we haven’t defined the standards yet.

What are the implications of GDPR for software?

  • 16:10 We need to be concerned with the way we write algorithms - but also with machine learning, we’re not really writing the algorithms.
  • 16:20 We have a training dataset - so we’ll have to develop the skill of selecting training datasets that avoid surrogates (proxy variables); see the first sketch after this list.
  • 16:30 I think tools will evolve, and testing in non-critical environments will give us a heads-up in case we’re getting unfair decisions.
  • 17:25 What people will eventually say is that when data is anonymised to a certain degree, it’s not yours any more (see the second sketch after this list).
  • 17:45 Otherwise, we’ll be unable to do anything with machine learned data sets.
  • 17:50 Another possibility is that we’ll use made-up data sets, like training against Grand Theft Auto.
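
On the surrogates point: a surrogate (or proxy) variable is a feature that looks innocuous but closely tracks a protected attribute, such as a postcode standing in for ethnicity. Below is a minimal sketch of the kind of screening a team might run over a candidate training set before using it; the feature names, data and threshold are all hypothetical.

    # Minimal sketch: flag features that correlate strongly with a protected
    # attribute and may act as surrogates (proxies) for it in a trained model.
    from statistics import mean

    def pearson(xs, ys):
        # Pearson correlation coefficient, computed by hand for portability.
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    protected = [0, 0, 1, 1, 0, 1, 1, 0]  # hypothetical protected-group flags
    features = {
        "postcode_band":    [0, 0, 1, 1, 0, 1, 1, 0],  # tracks the attribute exactly
        "years_experience": [3, 7, 4, 6, 5, 5, 4, 6],
    }

    THRESHOLD = 0.8  # arbitrary cut-off for this sketch
    for name, values in features.items():
        r = pearson(values, protected)
        if abs(r) >= THRESHOLD:
            print(f"{name}: r={r:+.2f} - possible surrogate, review before training")

A real screening process would go further (correlation only catches linear, single-feature proxies), but even this simple check would flag the postcode column here.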
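
On the anonymisation point: a common first step is pseudonymising direct identifiers, sketched below with the Python standard library. As a hedge on the bullet above, note that under GDPR pseudonymised data still counts as personal data; the degree of anonymisation described in the discussion requires more than this (aggregation, generalisation and so on). The record fields are hypothetical.

    # Minimal sketch: replace a direct identifier with a stable, opaque token.
    # Pseudonymisation only - under GDPR this is still personal data.
    import hashlib
    import os

    SALT = os.urandom(16)  # keep secret, and separate from the data

    def pseudonymise(identifier: str) -> str:
        return hashlib.sha256(SALT + identifier.encode()).hexdigest()[:16]

    record = {"email": "jane@example.com", "age_band": "30-39", "purchases": 7}
    record["email"] = pseudonymise(record["email"])
    print(record)  # the email is now a stable but opaque token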

What was the discussion behind the tech ethics track at QCon London?

  • 18:30 I wanted to get someone to talk about GDPR, but I couldn’t find anyone who was willing to talk about it.
  • 18:40 We wanted to get people to talk about diversity, recruitment, training, new engineers, algorithms - it’s such a wide field.
  • 19:00 People were a bit frightened to talk - whether it would be socially acceptable to talk about ethics in tech, or what others might say.
  • 19:20 We have got people to talk, but there was a huge social barrier in getting people to talk about this stuff.

What did you mean by ‘bitcoin is a terrible user of dirty energy’?

  • 20:15 In my blog post Ethical vs Unethical[https://container-solutions.com/ethical-vs-unethical-ethical-vs-inadvertent/] I used Bitcoin as a great example.
  • 20:30 As a segue: I was interested in containers as a way to improve data centre utilisation - using as little as needed, but no more.
  • 20:50 The average on-premise data centre is utilised at 10-15%, whereas Google runs at around 70% - the sketch after this list puts rough numbers on the difference.
  • 21:05 If we were a tiny industry who used hardly any energy, it wouldn’t be a problem.
  • 21:15 But in the tech industry, data centres account for about 2% of the world’s energy usage.
  • 21:30 It’s an issue because we’re at a huge scale, but we don’t have a good idea of what’s going on.
  • 21:50 We may be aware and want to be eco-friendly, but we don’t know how much we are contributing to the problem because of our own lack of knowledge.
  • 22:10 We may not realise how much compute power we have powered on but not in use - and with the scale of the data centres, that could be a lot of electricity.
  • 22:25 Bitcoin is almost the most extreme example of that.
  • 22:30 Bitcoin itself is neither good nor bad.
  • 22:40 In the UK, we are suffering because housing has become the main asset in which people hold their money.
  • 22:50 But people need to live in houses - a necessity of life shouldn’t be caught up in an asset bubble.
  • 23:00 It’s much better if capital assets are things people don’t have to have, like diamonds or bitcoins.
  • 23:10 From that perspective it would benefit the UK if housing stopped being that kind of asset.
  • 23:20 The problem with Bitcoin is that it consumes an incredible amount of energy to produce at scale.
  • 23:30 If you’re only talking about a couple of bitcoins, it’s fine - but if you’re talking millions of bitcoins, you are using an extraordinary amount of energy.
  • 23:40 Bitcoin production worldwide uses about as much energy as a small country, like Iceland.
  • 23:50 It’s predicted that at the end of 2018 it will use as much as a major country, like Italy.
  • 24:00 That vastly outstrips the pace of moving over to green energy.
  • 24:05 Much of that power comes from burning coal - it probably wasn’t planned, but it’s an inadvertent issue with the scale that Bitcoin has achieved.
  • 24:25 It’s a great example of something that wasn’t thought through in advance and now has negative consequences.
  • 24:50 Some of these things are impossible to predict.
  • 24:55 We need to spot them when they happen and then go back and do it differently next time.
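
To put rough numbers on the utilisation point from earlier in this list: the 10-15% on-premise figure and the roughly 70% figure for Google come from the discussion; the workload size below is a hypothetical number chosen purely for illustration.

    # Back-of-the-envelope sketch: powered-on capacity sitting idle at the
    # utilisation levels mentioned in the discussion.
    WORKLOAD = 1_000  # hypothetical: work that would fill 1,000 servers exactly

    for label, utilisation in [("typical on-premise", 0.125), ("Google-style", 0.70)]:
        provisioned = WORKLOAD / utilisation
        idle = provisioned - WORKLOAD
        print(f"{label}: ~{provisioned:,.0f} servers powered on, ~{idle:,.0f} idle")

    # typical on-premise: ~8,000 servers powered on, ~7,000 idle
    # Google-style:       ~1,429 servers powered on, ~429 idle

The several-fold difference in powered-on-but-idle hardware is the energy argument for higher utilisation.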

What are some of the practices we could put in place?

  • 25:35 We need to be aware that there are such things as ethical concerns.
  • 25:45 Are we hosting this in an eco friendly way?
  • 25:50 Are we training our algorithms on well regulated test data sets?
  • 25:55 Are our hiring policies socially aware?
  • 26:00 All we can do is wave our hands round a little bit and talk about it.
  • 26:10 We are at the stage of talking about it.
  • 26:30 We need to be able to talk about it.
  • 26:40 The GDPR is a means of imposing a standard, but we know from our sector that imposing standards doesn’t work terribly well.
  • 26:50 We are better off talking about our practices, and then adopting them more widely.
  • 27:00 We aren’t talking about it yet - so the next step is for everyone to talk about their best practices.
  • 27:05 (hopefully without getting a £10m fine from the EU)

Is it time for a Hippocratic oath in software?

  • 27:20 I think it is too early - because we don’t yet know what it is that we should be doing.
  • 27:25 I think we need one in the long run.
  • 27:30 What we should be doing now is figuring out what a Hippocratic oath might look like in the technology sector.

What role does privacy play in tech ethics?

  • 29:00 Privacy is up for debate - I don’t know how it fits into ethics.
  • 29:10 I’m not pro-privacy - I think it’s something of the past.
  • 29:30 Having people not know things about you is one of those areas: what looks dystopian to people of a certain age now might look utopian to people who are younger.
  • 29:40 I could imagine a future in which everyone knows everything about everyone - some might consider that utopian, some might consider that dystopian.
  • 30:05 Our vision for what is utopian or dystopian could change.

More about our podcasts

You can keep up-to-date with the podcasts via our RSS feed, and they are available via SoundCloud, Apple Podcasts, Spotify, Overcast and Google Podcasts. From this page you also have access to our recorded show notes. They all have clickable links that will take you directly to that part of the audio.
