
Collision: Online Harassment and Machine Learning


Online harassment is a serious issue, and one that the engineers and designers behind the keyboard don't always think about when building software. Machine learning is becoming more prevalent, but as more technology companies take advantage of it, they risk further alienating their users by presenting content that isn't actually relevant. It's important to remember that on the other side of the cloud is a human.

At the 2016 Collision Conference, speaker Pamela Pavliscak, founder of Change Sciences, argued that the explosion of machine learning carries with it the risk that companies will end up doing a disservice to their users. She says:

People feel like they're trapped in the filter bubble -- that they can't get out, that they're trying to expand their point of view (sometimes). They want to escape this "data double". Algorithms remember things differently than I remember things. They obsessively focus on one detail about me that I might have forgotten. They remember it, and keep remembering it, when I'm already past it. So, the algorithm's over here and I'm over here. [Algorithms] have incomplete information about us.

Another speaker, user researcher and user experience designer Caroline Sinders, argues that we can't wait for the machines to figure out how to keep people safe. She says:

Language is really contextual. We're creating all these new words right now that describe harassment, but are they understood? How would I explain doxxing to a machine learning algorithm? Can design mitigate harassment on social media instead of relying on algorithms, when it takes so long to implement algorithms that start to solve this problem?

InfoQ sat down with Sinders to ask what engineers and companies can do to better serve the humans using their software. She says developers need to team up across disciplines to address the harassment problem:

I think it's working really hand-in-hand and closely with user researchers, ethnographers, and UX designers. I think the problem of harassment is something where it definitely needs to be a collaborative effort to solve. So, it's not just on engineers to solve, it's not just on UX designers to solve, it's being able to work collaboratively together using design thinking to solve complex problems inside of large-scale systems. It's understanding that the solution may not be automation, it may be something else. The solution may not be machine learning, it may be an implementation of better UI/UX to illustrate nuanced privacy settings and then, from there, how do you create that in a large-scale infrastructure. It's not just how well the tech works, it's how well that tech is articulated to a designed front-end experience.

Pavliscak asked attendees at her talk to recite a pledge that they will "design algorithms with respect, collaboration, and transparency."
