May 20, 2024

A Greater Understanding of Race and Identity Through Tech

But that was five years ago. Today, outrage feels like the only thing on social media. It seems one of the easiest ways to build followers is to be consistently angry about something, whether it’s racism, politics or food. And the more extreme the view, the better. In many ways, social media has become a hotbed for radicalization and misinformation and bigotry.

I believe social media helped lift the Black Lives Matter movement and raise our awareness of police brutality, but I also believe it helped the white nationalists who protested in Charlottesville, Va., in 2017. In my view, these are manifestations of the same phenomenon.

Technology itself has been famously ham-handed at dealing with race. There are questions about whether algorithms exacerbate bias, and about why facial recognition makes more errors when trying to identify dark-skinned people.

Racism in technology is a serious problem, and I know there are many people who are much smarter than I am who are thinking about how to solve it.

In 2014, Stephen Hawking wrote a piece that I will never forget. He warned that artificial intelligence was advancing rapidly and that we needed to be honest with ourselves about whether we were prepared for what that meant. We are feeding machines with information every day, and the information we choose to provide says a lot about who we are — our racism, our values, our strengths, our weaknesses.

In other words, when facial recognition software makes an error identifying a person with dark skin, I believe that says more about us than about what the algorithm is capable of doing.

I think Mr. Hawking’s point was that you don’t have to spend all your time reading about neural networks and deep machine learning and A.I. and data streams to be concerned about this. Ultimately I think he was arguing this is a human rights issue. Interestingly, his article was published the same year that Facebook apologized for running psychological experiments on users without their knowledge.

Article source: https://www.nytimes.com/2019/10/30/technology/personaltech/lauretta-charlton-race-related.html