
Brittany Postnikoff on Security, Privacy, and Social Engineering with Robots


In this podcast, Daniel Bryant sat down with Brittany Postnikoff, a computer systems analyst specialising in robotics, embedded systems, and human-robot interaction. Topics discussed included: the rise of robotics and human-robot interaction within modern life, the security and privacy risks of robots used within this context, and the potential for robots to be used to socially engineer people.

Key Takeaways

  • Physical robots are becoming increasingly common in everyday life, for example, offering directions in airports, cleaning the floor in people's homes, and acting as toys for children. 
  • People often imbue these robots with human qualities, and they trust the authority granted to a robot.
  • Social engineering can involve the psychological manipulation of people into performing actions or divulging confidential information. This can be stereotyped by the traditional “con”.
  • Because people increasingly interact with robots in a human-like way, robots can themselves be used for social engineering.
  • A key takeaway for creators of robots and the associated software is the need to develop a deeper awareness of security and privacy issues. 
  • Software included within robots should be patched to the latest version, and any data that is being stored or transmitted should be encrypted (a minimal sketch of encrypting stored data follows this list).
  • Creators should also take care when thinking about the human-robot UX, and explore the potential for unintended consequences if the robot is co-opted into doing bad things.
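
As a concrete illustration of the "encrypt stored data" takeaway, the sketch below uses Python's cryptography library (Fernet symmetric encryption) to encrypt a piece of robot telemetry before writing it to disk. The payload, file name, and key handling are illustrative assumptions, not a description of any particular robot's software.

```python
# Minimal sketch: encrypting robot data at rest with the "cryptography" library's
# Fernet API. The payload and file name are hypothetical; on a real robot the key
# would live in a secure key store (e.g. a TPM or secure element), not next to the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # generate once, then store securely
fernet = Fernet(key)

telemetry = b'{"room": "kitchen", "timestamp": "2019-07-01T12:00:00Z"}'
ciphertext = fernet.encrypt(telemetry)

with open("robot_telemetry.bin", "wb") as f:
    f.write(ciphertext)

# Only a holder of the key can read the data back.
with open("robot_telemetry.bin", "rb") as f:
    assert fernet.decrypt(f.read()) == telemetry
```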

Show Notes

Could you introduce yourself?

  • 01:05 My name is Brittany Postnikoff, or Straithe on Twitter [https://twitter.com/Straithe]
  • 01:10 My current work is researching whether robots can social-engineer people, how they can do it, and what sort of defences we can put in place against those attacks.

Could you provide an overview of your "Robot social engineering" QCon New York 2019 talk?

  • 01:35 Some of the key themes of the talk include how people interact with robots on a social level.
  • 01:45 A lot of the research that has been done on human-robot interaction is about things like robots holding authority over people, especially if the robot looks to have been given authority.
  • 02:05 For example, when we do experiments, there's usually a researcher in the room, and you generally trust the researcher.
  • 02:15 What would happen is that the researcher would explain that the robot is going to say how the work needs to be performed, and that the participant needs to listen to it.
  • 02:30 In that case, the researcher is delegating authority to the robot, and people will interact with it appropriately.
  • 02:40 I also talked about empathy for robots, whether they can bribe humans, and similar experiments.
  • 02:55 I introduced social engineering, and talked about robot social engineering attacks that I can see happening soon.

Robots are becoming prevalent within our society - can you give any examples?

  • 03:10 The robots in my talk are very much physical robots - nothing like a Twitter bot that you might interact with.
  • 03:20 Roombas are usually the base case when I'm talking about robot social engineering.
  • 03:25 There are also robots in airports that help people navigate between gates.
  • 03:40 There are robots in stores helping to sell things; one called Pepper helps sell cellphones in malls.

Can you share how security, privacy and ethics differ in the domain of robotics?

  • 04:00 One of my favourite phrases: bugs and vulnerabilities become walking, talking vulnerabilities.
  • 04:15 It's the aspect of physical embodiment that makes the domain of robotics interesting.
  • 04:25 If you have a camera or microphone in one of your rooms, and you want to have a private conversation that's not listened to by Amazon or Google, you might go to another room.
  • 04:40 If you have a robot, and a malicious actor has worked their way into that robot, then it could follow you to a new room - all of a sudden there's new privacy and security issues.
  • 04:50 If a malicious person can get into your robot, they have eyes and ears into your home in places that they shouldn't.
  • 05:00 Having a camera that can move into a child's room or bedroom - you might not notice it if it's your robot and you're used to it moving around all the time.
  • 05:10 People don't understand that when a malicious actor takes over your robot, it's usually indistinguishable from when the robot is acting on its own.

How can we encourage engineers to think about security and privacy when they are designing these kinds of systems?

  • 05:30 I think a big part of it is awareness, which is why I like giving this talk and exposing people to this topic.
  • 05:40 If you don't know that certain attacks can happen, or that certain design decisions make these kinds of attacks more likely, why would you defend against them?
  • 05:50 Building a culture of more security and privacy is something I try and do with these talks.

Are blackbox ML and AI algorithms helping or hindering what's going on in robotics?

  • 06:20 My research typically avoids things like ML and AI, because when you do social robotics work, there's a concept called Wizard-of-Oz-ing.
  • 06:30 It's like in the movie, when you don't know until the last moment that (spoiler alert) there's a man behind the curtain controlling everything.
  • 06:40 We use that in research as well, because it's been shown that people can't tell the difference between an autonomous robot that is acting on its own, and one that is being controlled by someone.
  • 06:55 The important part of my research is the physicality of the robots, so how they gesture or look at people.
  • 07:05 If it's two feet tall, do you interact with it differently than if it's six feet tall?
  • 07:10 So we spoof the ML and AI, because results show people will interact with the robot the same way whether someone is controlling it or not, as long as they perceive the robot to be capable of acting on its own.

Can you introduce the topic of social engineering?

  • 07:40 It is a term that is used by the security community, but other terms that people might be familiar with are things like scams, con artists, etc.
  • 07:55 One of my favourite examples is from the early 2000s, when there was a gentleman who bought large amounts of cheap wine and rebottled it as more expensive types of wine (or relabelled it).
  • 08:30 He did the rebottling, but it wasn't enough - he had to convince people that the wine he had was worth buying.
  • 08:45 He had tailor-made suits and fancy cars - he lived the lifestyle of someone who would own expensive bottles of wine - and had a background story prepared.
  • 09:10 All that blends into one technique called 'pretexting' - there's information gathering, knowing what you want to talk about, and why.
  • 09:30 A lot of it is interpersonal, playing on the cognitive dissonance between what people expect and what is real.
  • 09:40 People often suspend their disbelief if the person is believable.

How can robots be used for social engineering?

  • 09:50 Robots have different social abilities that they can use; empathy, authority, bribing, building a trust relationship, and these are a lot of the same things used in social engineering.
  • 10:05 Robots can use these techniques in the same way.
  • 10:10 We had robots build rapport with people, and build a story about how the robot really liked working in the lab but was feeling sick lately and had a virus.
  • 10:30 If you watch the video you can see the participants having empathy towards the robot and how bad they felt for it when it was sick.
  • 10:40 The robot would say things like: "the experimenters are going to come and reset me - oh no!" and because it was an experiment, the researchers would do that.
  • 10:50 People were visibly upset when that happened.
  • 10:55 This is the kind of thing scammers rely on when there's been a big disaster; con artists will be asking for donations to a charity or saying that they've lost their home.
  • 11:05 There have been experiments where robots have been pan-handling to get money - and people were thinking that the robot needs a home too.
  • 11:15 It's amazing how similarly we treat robots that are able to move to how we treat humans.
  • 11:20 My goal for the next few years is to try and recreate social engineering experiments but with robots instead.

How do engineers in general mitigate some of the risks you have mentioned?

  • 11:50 You can put security on the robots.
  • 11:55 A lot of the robots I have interacted with have had dismal security.
  • 12:05 The server software is from 2008, a beta version, and has had no updates - there are dozens of open CVEs on it - so they are broken when you buy the robot.
  • 12:20 That's something to think about - make sure you can update the software, and someone is checking the security of robots in their home.
  • 12:35 One robot I was playing with has a very usable portal when you go to the robot's web server.
  • 12:45 A lot of the robots have their own servers on them.
  • 12:50 You can log in, but a lot of people don't change the default password, so it is easy to get in.
  • 12:55 It was easy for me to get in and make the robot do dangerous things.
  • 13:00 It's important to let users know or help them do proper set up.
  • 13:10 Engineers could develop processes that help people do a proper set-up and change default passwords (see the sketch after this list).
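
To make the "help users change default passwords" point concrete, here is a hypothetical first-boot setup flow for a robot's local admin portal: it refuses to complete setup while the factory default credential is still in place, and it stores only a salted hash. All names here (DEFAULT_PASSWORD, complete_setup) are illustrative assumptions, not drawn from any specific robot.

```python
# Hypothetical first-boot setup flow for a robot's local web portal: setup cannot
# finish until the factory default password has been replaced, and the chosen
# password is stored only as a salted scrypt hash.
import getpass
import hashlib
import os

DEFAULT_PASSWORD = "admin"   # the kind of factory default that often never gets changed
MIN_LENGTH = 12

def choose_new_password() -> str:
    while True:
        candidate = getpass.getpass("Choose a new admin password: ")
        if candidate == DEFAULT_PASSWORD:
            print("The factory default is not allowed.")
        elif len(candidate) < MIN_LENGTH:
            print(f"Please use at least {MIN_LENGTH} characters.")
        elif candidate != getpass.getpass("Confirm password: "):
            print("Passwords did not match; try again.")
        else:
            return candidate

def complete_setup() -> None:
    password = choose_new_password()
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    # A real device would persist salt + digest in its credential store and then
    # disable the default account.
    print(f"Setup complete; stored a {len(digest)}-byte password hash, default credentials disabled.")

if __name__ == "__main__":
    complete_setup()
```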

What can I do, as a user, to educate myself on these risks?

  • 13:30 Awareness is a good place to start; for example, I like to put my Roomba in a closet when it's done.
  • 13:35 That way, if someone was able to get control of it, they couldn't do much because it was in a closet.
  • 13:40 A lot of Roomba-like robots are now being sold as home guards; they have 1080p HD cameras on them, and can see if anyone is home or when they were last there.
  • 14:00 If you have vacations marked on calendars that are visible in your home, and a compromised robot can see that, it's a great time for someone to case your home.
  • 14:15 By putting the robot in a closet, at least you are protecting yourself from some of those things.
  • 14:20 It's always a good step to reset passwords for all of your devices when you get something new.
  • 14:25 Trying to run updates whenever you can is also a good idea.
  • 14:30 Specifically when it comes to robot social engineering, if the robot is in your space, there are things you can do to contain it.
  • 14:40 If the robot is in a public space, it's not great to try and contain it because you can get in trouble for messing with someone else's property.
  • 14:50 If you have robots wandering around taking pictures of people's faces or number-plates, people can be uncomfortable with that.
  • 15:00 Having a way to obscure your face can be useful too.

What is the state of interaction between academia and industry?

  • 15:15 As far as I know, I am one of two people in the world looking at this topic, and the first to actively write about joining social engineering and robots together that I can find.
  • 15:30 There are small overlaps with other places because this is so inter-disciplinary; academia and industry are doing well on collaboration and interacting with robots.
  • 15:45 Interaction between robots has been adopted by industry and is being researched by academia - that space is going well.
  • 15:50 Security and robots in general are going quite well - companies are showing up and presenting at conferences, so there is overlap between those spaces.
  • 16:05 I haven't seen as much overlap between academia and industry; I've tried to talk to some companies, and they're not as concerned with things that may happen.
  • 16:15 I think it will take an actual attack happening before people start paying attention.
  • 16:25 When it comes to robot social engineering specifically, it's a very new space and I'm looking forward to seeing what will happen.
  • 17:05 Because my topic is so inter-disciplinary, there are different groups who have different thoughts about how it should be done.
  • 17:15 I feel that it makes my research stronger, because there are so many differing opinions and lack of understanding between groups.
  • 17:25 I see my research as an opportunity to bring multiple groups together and give us a nexus to talk that we otherwise might not have.

What do you think the future is for robotics and software engineering from a sci-fi perspective?

  • 18:00 I don't think it has to be either software or hardware driven.
  • 18:05 That's one thing that usually gets me in sci-fi; there's usually a focus on only one technology, where multiple vulnerabilities probably exist in the same space.
  • 18:15 Star Wars does it well; you have multiple different types of robots, throughout the whole franchise, who do a variety of different things.
  • 18:25 If you look at C3P0, he's pretty terrible at walking but is very smart.
  • 18:30 Then you have R2D2, who doesn't talk much - there's some inflection in the voice which gives personality, but his communication skills aren't on a par with C3P0's.
  • 18:45 R2D2 is great for flying around in space, using tools specifically.
  • 18:55 Then you have some of the newer robots which are fantastic ninja fighters, which are obviously very good at movement, so I think Star Wars has that flexibility.
  • 19:05 At the same time, there's so many different formats for sharing information and processing data, in the Star Wars universe too.
  • 19:15 When you think about The Matrix, Snow Crash, Westworld, and I, Robot - there are probably examples from each one of those that we're going to move towards in the future.
  • 19:30 That's one way that sci-fi affects industry and academia: people put so much effort into creating the robots that they grew up with in movies and making them real.
  • 19:45 Star Trek has inspired so much technology since its initial release; it's a feedback loop, for sure, where our imagination inspires what's happening in tech.

What's the best way to learn more about robotics or your topic?

  • 20:15 A lot of what I do is social robots, and less about robotics - if you had me in a room with a lot of roboticists, we wouldn't have much to talk about other than design of outward appearance.
  • 20:30 For hard robotics, it's a lot of maths - so taking a lot of maths classes; learning how to use ROS, the Robot Operating System (a minimal ROS example follows this list); getting experienced with hardware boards.
  • 20:45 There are a lot of books by "No Starch Press" which would help there.
  • 20:50 Going to conferences and attending different workshops is helpful too - there's a few conferences I go to that have workshops that bring you up to base level.
  • 21:00 For university courses, trying a human computer interaction course is very helpful because it teaches you how people interact with machines like phones.
  • 21:15 A lot of those principles can also be applied to robots.
  • 21:20 I would recommend a robotics course if you wanted to get into physical and electrical robot design.
  • 21:35 For human computer interaction there are only a few schools that offer those courses, so choose one of them.
  • 21:40 I always recommend an ethics course - take as many of these as you can.
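
For the ROS suggestion above, here is the classic minimal "talker" publisher node from the ROS 1 (rospy) tutorials, as a first hands-on step. It assumes a working ROS 1 installation with roscore running; the topic name and message text are just the standard tutorial choices.

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) publisher node - the classic "talker" from the ROS
# tutorials - publishing a string on the /chatter topic once per second.
import rospy
from std_msgs.msg import String

def talker():
    pub = rospy.Publisher("chatter", String, queue_size=10)
    rospy.init_node("talker", anonymous=True)
    rate = rospy.Rate(1)  # 1 Hz
    while not rospy.is_shutdown():
        msg = "hello from the robot at %s" % rospy.get_time()
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```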

What's the best way to follow your work?

  • 22:00 Twitter is definitely the best way; it's the social media platform that I'm often on - https://twitter.com/Straithe
  • 22:10 I also have my own website https://straithe.com

Recommended next

Video (includes full transcript): Robot Social Engineering: Social Engineering Using Physical Robots

More about our podcasts

You can keep up-to-date with the podcasts via our RSS Feed, and they are available via SoundCloud, Apple Podcasts, Spotify, Overcast and Google Podcasts. From this page you also have access to our recorded show notes. They all have clickable links that will take you directly to that part of the audio.
