
Effective Ethics for Busy People


Summary

Kingsley Davies tells the story of Good Tech Conference, a conference he founded in 2018, shares the findings uncovered from it, and gives some concrete tips and techniques to help attendees work and live in a more ethical way.

Bio

Kingsley Davies is a partner at Underscore Consulting, who build scalable systems using functional programming to deliver value to their clients. He also co-founded GoodTechConf, a conference about ethics, technology, and social responsibility, and co-hosts the future tech podcast Breakpoint Radio.

About the conference

Software is changing the world. QCon empowers software development by facilitating the spread of knowledge and innovation in the developer community. A practitioner-driven conference, QCon is designed for technical team leads, architects, engineering directors, and project managers who influence innovation in their teams.

Transcript

Davies: Quite a busy room, which is both good and concerning to see on an ethics track. Concerning because a busy room on an ethics track suggests that we're heading towards the cliff at pace.

Let me tell you a bit about me, I'm Kingsley [Davies]. My handle everywhere is kings13y, with 13 in the middle of my name just to make it a bit trickier for people to find me, and it's also the handle for the conference I run in Brighton called Good Tech Conference. As I say, a busy room is both a good thing and a concerning thing. I'll give you a bit of my background too. At the start of the year in January, my partner and I opened a local cafe called South-By-West. Coming from the tech community, I immediately had the reaction of, this is so close to South by Southwest, we can't possibly do this. My other half's got nothing to do with technology at all and immediately rectified my mistake. We're on a road called South Street, right by a station called West Worthing station, hence the name South-By-West.

Part of the background to the coffee shop that I run, it's a real bread bakery and cafe, but the principles underlying it are all around reuse and ethics. We tried to bake ethics into the core of the shop. When we fitted out the shop, we tried, as far as possible, to upcycle all the materials in it. We've got traceability of all the goods that come into the shop too, and everything we stock we've tried, as far as possible, to keep locally sourced. That's one thing that I'm not here to talk to you about today, but I can talk to you at length about it afterwards if you want.

Otherwise, I'm a partner in a consulting company called Underscore. We specialize in functional programming and open source software, predominantly in Scala. We also run distributed systems and distributed teams, which again I'm not here to talk to you about today, but this is my time when I can tell you a bit about my work stuff.

BreakPoint Radio is something that I've not managed to keep up with over the past couple of months because I've been busy with the coffee shop, but it's a podcast that I've run for the past couple of years. This is where we start to get into the stream of ethics: the podcast was typically published monthly, I was the co-host, and we'd typically talk about future or emerging technology and, as part of that story, some of the implications of the emerging technology that we were seeing on the horizon, the implications around diversity and ethics, and where we could generally see the cutting-edge, possibly bleeding-edge, tech world heading.

Specifically, this segues into what I'm here to talk to you about today, which is Good Tech Conf. Good Tech Conf is a conference that had its first outing last November down in Brighton on the south coast. It was a two-day conference around ethical technology and Tech for Good, so socially responsible technology too. Two collaborators and I put it together and ran it as a bit of a discovery process, a fact-finding mission to see what was going on in the broader world of tech ethics and tech-for-good social technology.

What Can I Do?

One of the underlying things behind putting the conference together was that, having done the podcast for some time and seen the general direction of travel within technology, there's this unnerving, underlying sense of, "Oh, we're going off the cliff. Ok. What can I do? How can I help? Where can I get involved?" I sense that that is possibly a question that many people who are here today are asking themselves too, and hoping they can get some start towards an answer.

I think it's a general thing of, "I can see the direction of travel, I'm uncomfortable with where things might be heading. How do I get involved in the Tech for Good world, or what can I do to either turn the ship around or try and make things slightly better bit by bit?" That was where the conference came from. We looked at the general environment and thought, "What can we do? Where do we get started?" Hence: let's do a fact-finding mission, a recce of what's out there in the social good world and in the Tech for Good world.

Good Tech Conf (GTC)

Good Tech Conf, or GTC, because Good Tech Conf is quite a mouthful, and we're in technology so we love acronyms. GTC in a sentence: a not-for-profit conference on the ethical implications and social impact of technology. That's a lot of words in a sentence, isn't it? Breaking that back down: what are the ethical implications of the things that we're building? What happens when we do tech and it goes bad, when tech goes wrong, and not when it goes wrong by mistake, but possibly when it goes wrong by design? What about when data is the product and we are part of that product, and when the product uses our data to influence us, and to get us to do things which we might otherwise not consider doing?

When the whole business model and the product is using our information to influence us and drive us in specific, targeted ways, is that a world that we want to be in, and what can we do to maybe rally against that or put safeguards in place to do something about that?

Also, the social impact of technology. One finding that we discovered quite early on is that the world of technology, possibly deep technology and tech conferences, and the world of the third sector, I don't know if this resonates with people, but Tech for Good or the charity sector who are using technology, are both massive worlds but very seldom converge. There's very little collision between the two, and the problems in one of those worlds, say the problems the charity world faces, are seldom what the cutting-edge technology domain is trying to address, but we'll get to that a bit later on. That was really meant to be GTC in a sentence. It was intended to be a discovery and fact-finding mission.

Why

As soon as I mention to people, "I've done something, it's around ethical technology and Tech for Good and we're going to see what we can do," the first question that normally comes back, and I suspect it's partly people being on the defensive, is, "Why? Why are you doing that?" That's the accusation, out with the pitchforks and one thing and another. Why do this at all? It's a lot of energy and time investment. Why bother even starting down this path?

Maybe we start with a slightly bigger question of why in the large. One driver towards it is the idea of getting meaning from things. I don't know if people have read this book or not, Viktor Frankl's "Man's Search for Meaning", but it was written around someone's Holocaust experiences. What he was keen to draw from those experiences is what the underlying motivations or drivers were for the people who survived. What are the things that Holocaust survivors actually took from it and that steered their ship through it? What were the lights at the end of the tunnel that kept them going during their darkest times?

The book is split into various sections. The first section is a preface giving you an overview of what's to come, then there's a set of stories that come from both the author's and his associates' Holocaust experiences, and then at the end of the book it goes into some slightly different areas around psychoanalysis and a particular thing called logotherapy that this particular author was keen on pushing as a means of doing psychoanalysis.

Without going too deep into that side of it, one of the overarching messages that comes from the book is that people have control over how they react to events. They might not have complete control over the events and the scenario that they're in, or what is happening to them, but how they react to that and what they take from that, people are responsible for and are in control of.

Another main theme, as the title suggests, is that the author found people are driven by a desire to find significance in their life. There are lots of different, slightly existential ideas of people having this will to power, or will to other things. The author himself found it was closer to Kierkegaard's idea that people have a will to meaning. People want to find some level of significance in their life, and that is what drives them forward. From the author's perspective, he certainly found that there were key anchors that people pin that significance to.

Typically people find significance from work. If they have something within their work life that they find inspiring, that can drive them forwards. One of the implications of that is that it's often seen that when people retire from work, they find it difficult to find that drive to get up and go and do things until they find, say, pet projects or other things that give them a sense of significance in their day to day which they can then drive towards.

Work was one domain in which he found people were able to find a sense of significance in their lives. Another one was this sense of adoration. Adoration or love, particularly for one's family, so the idea that actually a lot of the Holocaust survivors were driven to get through that experience because they imagined what it would be like to get back with their family at the other side of it, and this idea that their sense of family gave them a sense of significance, and that that sense of adoration for their family is what drove them through.

The third bucket, or the third category, in which the author suspected people found significance was the sense of bravery: the sense of taking on adverse times and surviving them, getting through the challenge, making it to the other side of that challenge. Three distinct buckets which the author found acted as hotspots giving significance to people's lives, which then led to this idea of logotherapy as a form of psychoanalysis.

Meaningful Work

This is reinforced with recent research too, or at least recent journalism, which suggested, and this is from last November, that 9 out of 10 people are willing to be paid less if they find some sense of meaning, significance, or drive within the work that they do. Having that sense within your work environment can more than compensate for the monetary remuneration that you get from the work. The slides are all going to be available afterwards, and there are links to all of the articles in the slides too, but feel free to take notes and snapshots if you want to take them away with you.

We've got this sense of meaning, this underlying value system or set of drivers that makes us want to do something, gives us some sense of significance day in, day out, and gives us something to get up for in the morning.

Privilege4Good

Next, we've also got this notion of Maslow's hierarchy of needs. This is something that I got into at Good Tech Conf as I was introducing the conference: that it was actually from a position of privilege that I was able to help drive, or at least help steer, that conference, in that once your physiological and safety concerns are taken care of, you actually have the headspace or capacity to think about finding meaning.

The often-published pyramid ends at self-actualization, but in later works Maslow added an additional tier above this, transcendence, which is the notion that we're doing something that is beyond ourselves. It's not just something for self-actualization, not something just for myself and to improve myself, but something that goes beyond my physical domain, something that can transcend me and potentially last beyond me too, something that transcends my desires and affects society writ large.

During Good Tech Conf, there was a strong sense that actually putting the conference together was something that was slightly transcendental because it affected a population beyond those who were putting the conference together, and it was a massive privilege, we were in a position of privilege to be able to do that. Privilege isn't a bad thing, it's how you then use that privilege which is the important thing.

Why Tech? Why Now? Influence and Impact

Arguably, anyone who's probably attending this conference and in this room is similarly in a position of slight privilege. That's no bad thing, it's how we then use that leverage that we've got to try and make things better.

Let's unpack meaning then. We've got an idea of the underlying drivers, the transcendence, and the value system. Why put this conference together around tech, and why do it now? Influence and impact. The main driver is the idea that being able to influence en masse is something that digitized companies are now able to do at speed and scale. It's the idea that if I have access to your Facebook feed, I can then populate other items within your Facebook feed in such a way as to try and influence your decisions, influence your outlook, and maybe steer your actions accordingly off the back of that. That has always been one of the many goals of marketing, but the fact that it's now been digitized means these decisions and influences can happen at such speed and scale that digitized influence and impact has a massive effect globally.

Also, the “Why now?” We've got speed and scale, and there was also the year of the scandals. We saw this very much through the podcast: the Cambridge Analytica scandal, the Uber scandals, the Facebook scandals of populating people's data feeds, and things like the Volkswagen scandal too, where they manipulated the emissions readings from the car, the carbon outputs from the car, and that was all done in software. From what I remember, a software developer who worked on that was held to account for it, even though he was just executing whatever his product owner or product line had said needed to be done to make the car get through those tests.

Project Maven, if people are aware of it, was something in the AI domain: a military AI project that Google worked on, which was used for military drone work in the Middle East, and there was quite a reaction against that both within Google and outside of Google. I believe they then dropped the contract off the back of that. If anyone wants to know more about it, I found out about it from the Coed Ethics conference, but we'll get on later to some links that can take you down all kinds of rabbit holes to find out more about this stuff.

Influence and impact, data being the new gold. Whoever owns all of our data and has the fastest algorithm and then can republish things off the back of that, they have the keys to the kingdom. Data is the new gold, the new oil, the new currency. We've got influence and impact and that being digitized and having speed and scale.

Access and Accountability

Access and accountability were two things that we covered numerous times during the podcast. This reflects back onto the use cases we've just mentioned, in terms of both skews and what the truth is. Who's accountable for that truth, and who's influencing my decisions? Is there actually someone paying for me to have a particular slant of news fed to me, which influences my decisions? What's the accountability, especially when things get automated? In terms of access and accountability, we're also looking at things like autonomous cars. Who actually seeds the software for autonomous cars, and what are the influences and, literally, the drivers behind those cars? If they have to perform an action rapidly, who then is accountable for accidents or mistakes that happen with autonomous vehicles? Not just autonomous vehicles, but autonomous software in the large.

A big theme at the moment is diversity in the tech workplace. I've not been around for the past couple of days of QCon so I've no idea what the diversity skew has been among speakers and attendees, but I think generally, from previous tech conferences, the diversity skew is very heavily in one camp. This was something that we wanted to try and talk about during Good Tech Conf: how would we open the doors on that? How would we widen access to technology and have a more diverse workplace?

We know Mel Conway and Conway's Law around organizational design. If you drop the E from Mel, it feels like there's almost an ML Conway's Law, in that AI and ML systems are built and seeded with biases based upon the people who are building those systems. If you have a team of people from a very non-diverse background building these AI or machine learning systems, the biases within those systems are fairly naturally likely to take on the properties of the people who are building them.
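To make that mechanism concrete, here is a minimal, hypothetical sketch: a toy screening model that "learns" nothing more than the base rates in a skewed history of past decisions and then reproduces that skew as if it were signal. The BiasSketch object, the Application type, and the background labels are illustrative assumptions, not any real system or dataset.

```scala
object BiasSketch extends App {
  // One past hiring decision made by the existing (non-diverse) team.
  case class Application(background: String, hired: Boolean)

  // Hypothetical history: the team historically hired far more people
  // from background "A" than from background "B".
  val history =
    List.fill(80)(Application("A", hired = true))  ++
    List.fill(20)(Application("A", hired = false)) ++
    List.fill(10)(Application("B", hired = true))  ++
    List.fill(40)(Application("B", hired = false))

  // "Training" here is simply estimating P(hired | background) from that
  // skewed history, which is roughly what a naive data-driven screen does.
  val score: Map[String, Double] =
    history.groupBy(_.background).map { case (bg, apps) =>
      bg -> apps.count(_.hired).toDouble / apps.size
    }

  // The model now ranks candidates from background "A" far higher, not
  // because of anything about the candidates, but because of who made
  // the past decisions that became its training data.
  score.toSeq.sortBy(-_._2).foreach { case (bg, p) =>
    println(f"background $bg: predicted 'fit' $p%.2f")
  }
}
```

Nothing in the sketch looks at a protected attribute directly; the bias arrives entirely through who generated the historical data, which is exactly the ML Conway point.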

Going forward, diversity in the workplace is going to be a need rather than a want. Rather than being a marketing tool and something that companies want to advertise, it should really be a necessity.

Green Tech & Tech4Good

Another theme which we were keen to cover during the conference is around Green Tech and Tech4Good. I'm not going to go too deep into Green Tech, because if you've been around for the past couple of days, there have been some fantastic talks about green tech already.

I think Paul and Jason, from what I can tell from my Twitter stream, did a real standout job of talking about Green Tech and the effects of data centers and power usage on the environment, and how that affects the polar ice caps too. I was very lucky to have Paul talk at my conference as well, and he did a great job there too. It's become an increasingly salient characteristic of hosting.

I've done recent talks about Bitcoin and blockchain. It's interesting that the last figures I saw, from roughly six months ago, showed that if Bitcoin miners were a country, they'd be roughly the 38th most power-hungry consumer of electricity of all the countries on the planet. You're effectively swapping electricity for oil or gold in this case: the power to commit the transaction and get the reward.

On the Tech4Good front, I found that that was a very different world to the tech conf community that I was used to working within. We wanted during the Good Tech Conf to try and figure out what is happening in that world. How can people get involved? Are there areas where people can get involved too?

Inspirations

Two key inspirations were conferences that we attended before running our own. One is a conference that's been around for several years now called Meaning Conference, which happens annually in Brighton. It's a single-day, multi-track conference really tailored towards the underlying meaning within business: different business models that people can adapt and adopt to try and put that idea of meaning, beyond just profit margins, into their businesses.

If you've not attended one before, I would massively recommend it. That's probably one of the few conferences where I started to well up during talks, as people from the UN spoke about how they were trying to adapt technology, and all kinds of atrocities and horrors that they'd seen. Coed Ethics was also a key inspiration in putting our conference together. If you've not been to the Coed Ethics site, I strongly recommend doing that too. There are so many resources listed there and some fantastic blog posts which give you some key ideas of where you can go with technology going forward to try and keep it ethical.

Good Tech Conf (GTC)

Good Tech Conf ran in November as a two-day, single-track conference. You can probably tell from this somewhat muted graphic in the background that we based it on Brighton seafront, with the pier and the wind farm out in the distance. By design, we intended to run this conference outside of London, because if you want to attend quite deep tech conferences, or conferences around meaning, you can find them within London; outside of London it's a bit trickier. Given that I'm based on the south coast, I thought this would be an easier way to get into this environment, to get people along to start talking about things, and to distribute some of the knowledge about what's going on within tech out of London.

It was based on Brighton's coast, with the wind farm in the background, which by the evening felt a bit like it was coming to get us. The second day was around workshops and a hackathon. Day two started with a mini-workshop, and the afternoon was spent hacking proof-of-concept ideas which augmented some of the ideas discussed on day one. We were also picking up tickets: if you go to GitHub, you can easily find repositories with low-hanging-fruit issues for social impact projects, and I'll add the link to this later on.

We also strategically timed the conference to be the Monday after Meaning Conference. With Meaning Conference attracting lots of people interested in meaningful business strategies and being on the Thursday, we targeted ours for the Monday and Tuesday after it, with a discount on the local hotels, so that if anyone wanted to go and do the Thursday and then stay over and do the rest, we were hoping that would make it a much more saleable commodity.

You can probably see from a selection of the videos that are on YouTube some of the different things that we covered there: typically green and environmental technology and social impact technology. Alex did a fantastic talk too about algorithmic biases in machine learning, where the outcomes you get are based on the data that you feed into these machine learning systems. In Alex's specific case, his talk was about how people who had a turbulent background or some trouble in their past were repeatedly flagged in the U.S. by AI systems as potential suspects for robberies or other kinds of incidents.

There was a massive skew in the data that had been fed into these systems, coming from a very non-diverse community who were always being pulled in and questioned about different activities happening in the U.S.

This really captures one specific type of AI and ML system: not a rule-based machine, but one where the algorithm tailors itself based on the statistics of the data fed in, and the data fed in was skewed by the people building those machines. Other talks that we had there were around encouraging kids into technology and increasing diversity in technology. We also had an awful lot of talk around UX design, and UX design and ethics.

I think this is almost a pendulum swinging back against nudge mechanics, if anyone remembers what nudge mechanics were: how to gently nudge or influence people's interaction with user interfaces based on the layout of the interface. This was very much about making it explicit what people are going to do and what they're subscribing to with their actions within these spaces. Yes, we had several talks around UX design, and a great talk, which isn't featured on this snapshot that I've got, around the Ethical OS Toolkit, which we'll get onto.

Points and Actions: Find your Frequency

Takeaways from the conference. This was meant to be a bit of a discovery and fact-finding mission: host a conference and see what we walk away with. Here are some of the takeaways that we got from it and some pointed actions off the back of it too.

Find your frequency. I do quite a lot of stuff in functional programming in the Scala community. Given that context, I hear lots of things about people's experiences of the "echo chamber" on Twitter, of it being quite a rowdy environment with lots of strong opinions expressed strongly. It's not the idea of strong opinions weakly held; it's strong opinions strongly held and strongly expressed.

I found that doing Good Tech Conf, subscribing to a different frequency and getting a different echo chamber together around that conference, was a very positive experience. A lot of the feedback and recommendations that I've got from doing Good Tech Conf are incredibly positive and affirming, and point you in different directions of things that you can do.

If people are interested in connecting to that frequency, seeing sustainable Tech for Good, and what things they can do off the back of it, I generally try to lay this out as findings, some questions or actions, and some links you can follow up. If you subscribe to the Good Tech Conf Twitter feed, we tend to repost lots of positive opportunities around building ethical technology or diversity in technology.

DotEveryone are a great organization, also London-based, who are trying to do a lot around ethical technology. There's the Beyond Tech conference, which is coming up in May and is very similar to Good Tech Conf but London-based. If you've not already added your name to it, there's a link to the sustainable servers petition, which is trying to influence government around setting up stronger regulation of what's happening with green technology and sustainable servers, and there are the ACM and Coed Ethics. The slides are available afterwards so I won't just read out what's right in front of you.

Analyze, Act and Iterate

Analyze, act and iterate. The idea here: find the big idea that is on the horizon, but actually try and act locally. Try and do things tactically, do small things with as big an impact as you possibly can, measure the result, and then iterate back around that loop.

This is quite close to the idea of micro-mastery. If anyone's read the book around micro-mastery: find one specific technique in something that you want to master, and master that one thing, which might be, say, making an omelette. The bigger idea might be to become a chef in the future, but if you can master an omelette, that gives you the high-impact, low-hanging fruit which then stimulates you on to do the next thing, and the next thing. Try to execute things in the short term with the long-term goal there on the horizon, but you need those small boosts along the way to keep you going.

Choosing ethically, and the Ethical OS Toolkit. The links are down here. I can't recommend this link strongly enough: giveasyoulive.com. There are plugins for various different browsers that effectively allow you to donate to charity every time you buy something online. I'm yet to find a vendor or a product online that I've bought that doesn't go through this site. I think it takes 1% or 2% of the payments that you make to the product owner and gives that to charity.

If you're a busy person and want to act ethically, it's the lowest-hanging fruit. You just go to the website and it donates for you; it's fantastic. Also, things like Ethical Consumer will give you a rating of how ethical the different companies you might be purchasing from are. In terms of trying to metricate, measure, and build things that stay ethical in the future, I'd recommend the Ethical OS Toolkit, of which there's a blog post by someone on the Coed Ethics website. The Ethical OS Toolkit itself is something like 70 or 80 pages of PDF; the three-page blog post is much easier to consume and take something from.

Diversify All-the-Things

Diversify all the things. This feeds into the idea that if we're feeding data into AI machines, which are data-driven machines, maybe we need to diversify that data and figure out what the various different unhappy paths actually are. We spoke earlier on about products where it's not hacking the product and getting at the data that leads to the unhappy path; the product itself is built to influence and impact you in such a way that pointing you in a specific direction is a key component of the product. Let alone the unhappy paths that come from that: if a product is already heading down the road of influencing you in a very specific way, what happens when this goes slightly out of control?

I think we've all heard about the Microsoft AI bot which was fed from a Twitter feed and, within 24 or 48 hours, effectively became a member of the far right, or something like that. It all went very wrong very quickly, which shows the speed and scale at which we're able to feed these machines too.

In terms of diversifying all the things, lots and lots of companies say that they want to diversify their workforce. The thing is, are they actually putting measures in place to make it an attractive workplace for a diverse community or a diverse environment? I typically do school runs every morning. If I have 9:00 meetings every morning, that will not work for me, let alone lots of other people who do those school runs too. Are things tailored in such a way as to maybe have core work hours of 10:00 to 2:00 with flexible working? These are all things that will actually help diversify a workplace, by having that flexibility for people's lives, because not everyone is able to just turn up at 6:00 a.m. and plow through until 6:00 p.m. I can readily plow through things until 3:00 in the morning, but 9:00 a.m. meetings are not going to work.

There's a list of different links and people to follow at the bottom. If you want to get involved, there are lots of groups like Codebar where you can turn up and mentor someone for an evening and help people get into technology or learn some technology. Lots of things here: Codebar, Girls Who Code, UKBlackTech, Techmums HQ. All the things.

Get Up, Stand Up

It took all of my power not to put the Bob Marley picture up here but anyway, get up, stand up. Are the workplaces that we're in psychologically safe for us to be able to stand up and rage against the machine that we're working within? If they're not, potentially we're working in workplaces that we will soon be looking to leave for other workplaces.

If you can see or conceptualize the product that you're working on going into areas which you're uncomfortable with, are you working in an environment which makes it possible for you to do something about that, for you to actually stand up and make a statement about it? We've seen recent walkouts at Google, where workforces have banded together and rallied against senior management to try and change some of the excesses of where some of the products are going. If you don't feel that you're working in a psychologically safe environment, I'd suggest that it's probably a good time to start looking for other places to work.

Having pride in your product. Are the things that you're building things that you feel comfortable going out and talking to people about, saying, "I'm working on this thing, it is brilliant. It's awesome"? Every time I meet someone at the moment, given that I started a coffee shop roughly eight weeks ago, I bore people to tears about coffee. I see them walking away shaking their heads, but it's something I've got real pride in and am really excited about. I'm hoping that people get the same sense from the stuff that they're working on, and if not, try and think of strategies to customize or tailor it to make it more in that vein.

People like DataKind set up things for data scientists to do, called DataDives, which are weekend hackathons they run for charity organizations, using all of the data that those organizations have captured to suggest how they could optimize their operations. They've done things like optimizing delivery routes and how much food is distributed to different food banks based on the numbers of people who arrive at those food banks. They're optimizing for the charity sector using the data that is fed in.

Also, Tech4Good meetups, Major League Hacking, people like ThinkNation. Lucy at ThinkNation is a powerhouse who is really striving forward to try and get a more diverse work environment and get young people interested in the future of technology.

Be the Hero

The final slide, I know it's cheesy, but be the hero. Not just my words; that's a whole thing that's out there. If you look up "be the hero", it's a whole meme, it's not just me pushing it. Be brave, be optimistic, try to change things as best you can. If you can't change things, maybe pivot into something else or get collaborators to help you push the van and try and change things with you.

I'm about to wrap up. I would say that Good Tech Conf is happening this November. On the train up here this morning, I hopefully put the CFP live for it. No speakers are listed on the website yet, but we are doing early bird tickets and we are accepting CFP submissions. Please feel free to check out the site and maybe follow us on Twitter if you want a quite positive and reinforcing echo chamber.

Questions and Answers

Participant 1: I’m just interested in your opinions on tech and democracy. Obviously, we've seen ways in which tech can be quite damaging to democracy with things like Cambridge Analytica. Do you have any ideas on how we could actually use tech to improve our democracy?

Davies: There are movements in places like Estonia, which are trying to do things loosely based around consensus and blockchain-based voting, so that people can vote individually and remotely, and trying to digitize that voting process. Though I suspect, with your mention of Cambridge Analytica, you're asking a slightly different pivot on that question: how do you maybe stop the influence of all of those streams coming in?

Very difficult indeed, other than currently trying to match the budget, effort, and marketing that goes in, to try and level out that skew, or putting regulation in place. I'm sure everyone is well aware that regulation typically lags by quite some distance. It tends to be very reactive, so it tends to lag the actual marketplace. With things like Cambridge Analytica, I'm sure we'll see regulation in place to try and prevent that from happening in the future, but for that regulation to mature enough to tackle the core problem is probably still quite some way off. Apart from being the hero, getting up and standing up, and making a noise about the other camp to try and normalize that out, I think it's slightly a matter of waiting for some regulation to catch up as well.

Participant 2: What's one thing we can do today besides shop via that website?

Davies: That website is the lowest-hanging fruit that I have ever found, and it is so effective too. Other things? Without shilling too heavily, of course there's buying tickets to Good Tech Conf and telling everyone you know. Outside of that, definitely subscribe to the Twitter feed for Good Tech Conf, because then you can pick up on the specific thread of good technology that you want to get involved in. We retweet all kinds of stuff and give links back to things too.

Things like the Coed Ethics site list a set of resources, so that if people are interested in this kind of thing, there's a multitude of different avenues and paths that you can go down, depending on the type of technology that you're into.

I suppose another finding as well is that the third sector, the social and charity sectors that we talked to, have radically different needs from technology and going digital. What they typically need is, "Can someone come in and help us upload content to WordPress? Can someone come in and help us migrate from this version of Excel to that version of Excel, because we're really anxious that we're going to lose loads of data?" There's actually such low-hanging fruit there that someone reasonably technically savvy could go in for an hour and make a massive impact on those charity businesses. There are lots and lots of things like that too. They're available through either Coed Ethics or the Good Tech Conf website.

Participant 3: I was wondering what is your familiarity or involvement with the effective altruism movement especially in the UK?

Davies: Very little, if I can get someone to talk about that at Good Tech Conf in November though, I'd be very keen on that. What’s your involvement with it? Are you involved in that movement at all?

Participant 3: I'm just personally curious.

Davies: No worries, because if you were involved in it and wanted to talk after this, it sounds really interesting.

Participant 3: No, sorry. I don't have any expertise.

Davies: All right, not to worry. I know a tiny bit about it. I know what it is, but I've not got involved in it. Hence why we're doing another Good Tech Conf because we found out some facts, but we need to do more fact-finding too.

Participant 4: This is probably the only thing I know about it. It's the use of scientific evidence and data to maximize the amount of good that you can do in the world, things like analyzing the effectiveness of charities. There's GiveWell, which is a website where they analyze, and I suppose it depends on your philosophy, but I think it sometimes comes across as quite utilitarian, how many lives $1 can save.

Davies: I would strongly recommend having a look at an organization called DataKind, who do a lot of stuff around predictive analytics and have worked with a lot of charities, using things like Hadoop and Spark clusters to suggest how those organizations could use their money and resources more effectively, and how they could maybe streamline how some of their resources are used, purely based on the data that's fed in.

Participant 4: Also, can I add? What you spoke about in the beginning about getting meaning from work, I think that overlaps with what 80,000 hours do. It's quite comprehensive, it's quite digestible. It's a website that basically says what you're saying.

Davies: 80,000 hours?

Participant 4: Yes, 80000.org. It's quite fascinating content.

Davies: Wow, ok, I'll have a look. That's new to me, so it's great that I've picked that up today. Thank you.

 


 

Recorded at:

Aug 06, 2019
