The inner workings of technology may be a mystery to some... likely a mystery to many. And that can lead us to believe it's infallible. There's no way a computer could be biased, right? It doesn't have eyes or feelings or a brain. It's all just math! (Well, there's artificial intelligence... but that's a different story!)
But that could not be more wrong. What we must consider is not the computer's bias, but the biases of the people who program it. The groundwork had to be laid somewhere, somehow, by someone. Everyone has biases, some more than others, and these get built into the computer systems that we so regularly use and deeply rely on in 2020 (think Google Maps, Apple App Store suggestions, Facebook news, Instagram ads, and more).

Hao and Stray provide a great illustration of the biases involved in the computerized systems courts use to help decide whether an accused party gets bail. Though these systems do not technically take race into account, their decisions still disproportionately harm people of color. O'Neil paints a sad, yet very real, picture of a talented teacher who was wronged by the computer system used for student testing and teacher evaluations. A school lost an impactful educator, and an educator lost a paycheck, due to a score that was "too complicated" to explain in words.

As I read about some of the social issues in computing this week, I was both amazed and horrified at the impact that tech companies can have on my life and my access to information. As O'Neil (2016) cites in Weapons of Math Destruction, Karrie Karahalios, a researcher at the University of Illinois, found that 62% of people were unaware that Facebook curated their feeds; they didn't realize they were not simply seeing every post their friends made. While I'm not quite that naïve (cue "Instagram algorithm" being a buzzword this year), the examples O'Neil illustrates made me realize that there is so much more happening than I had fathomed. One example that stuck with me was Facebook's research on user moods: how the statuses people saw affected the statuses they then posted. It's a bit scary to consider that a social media company can purposely toy with our emotions.
These realizations beg a consideration of how I, as an educator, can help teach my students to be wise users of technology and to be critical of the equity issues these readings raise. In my fourth grade classroom, this starts with lessons on digital citizenship and an understanding of how words (spoken or typed) affect those around us. Using a critical lens, we can look at authors' biases and practice identifying real news instead of taking everything at face value. Though it seems high level for nine- and ten-year-olds, they're using technology every day. I think it vital to equip them to interact with it thoughtfully instead of making fruitless attempts to keep them away from YouTube, Google, etc. In my experience, elementary students have a deep desire for things to be fair, and this is such an important topic to tap into at a young age. What's fair for technology to do without us knowing? Who can fairly use technology? When is it fair to use a computer to make a decision? These are all questions our future generations will deal with, and ones educators can [hopefully] help them answer wisely.

Hao, K., & Stray, J. (2019, October 17). Can you make AI fairer than a judge? Play our courtroom algorithm game. MIT Technology Review. https://www.technologyreview.com/2019/10/17/75285/ai-fairer-than-judge-criminal-risk-assessment-algorithm

O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Basic Books.