CHI 2018 papers.

Anja Thieme, Cynthia L. Bennett, Cecily Morrison, Edward Cutrell and Alex Taylor (2018) “I can do everything but see!” – How People with Vision Impairments Negotiate their Abilities in Social Contexts. In Proceedings CHI ’18. ACM Press.

Ari Schlesinger, Kenton O’Hara and Alex Taylor (2018) Let’s Talk About Race: Identity, Chatbots, and AI. In Proceedings CHI ’18. ACM Press.

Very happy to have contributed to two papers being presented at the upcoming CHI conference this year. One reports on work with blind and vision impaired people that a few of us have been involved in, in different ways (see here). Broadly, we’ve used the piece to reflect on the relations between vision impairment and artificial intelligence, and to set out directions for a possible design space.

The second paper picks up on a new theme for me, but one closely related to past reflections and design work around machine intelligence. With the fantastic Ari Schlesinger (GA Tech) leading the research, we examine the challenges faced in handling race talk (and racism) in human–bot interactions. Taking both Tay AI and the blacklist as starting points, we take seriously the computational underpinnings of chatbots and conversational agents, underscoring both the role they play in sustaining troubling racial categories and the conditions they make possible for more just and equitable ways forward.

Abstract — This research takes an orientation to visual impairment (VI) that does not regard it as fixed or determined alone in or through the body. Instead, we consider (dis)ability as produced through interactions with the environment and configured by the people and technology within it. Specifically, we explore how abilities become negotiated through video ethnography with six VI athletes and spectators during the Rio 2016 Paralympics. We use generated in-depth examples to identify how technology can be a meaningful part of ability negotiations, emphasizing how these embed into the social interactions and lives of people with VI. In contrast to treating technology as a solution to a ‘sensory deficit’, we understand it to support the triangulation process of sense-making through provision of appropriate additional information. Further, we suggest that technology should not try to replace human assistance, but instead enable people with VI to better identify and interact with other people in-situ.
Abstract — Why is it so hard for chatbots to talk about race? This work explores how the biased contents of databases, the syntactic focus of natural language processing, and the opaque nature of deep learning algorithms cause chatbots difficulty in handling race-talk. In each of these areas, the tensions between race and chatbots create new opportunities for people and machines. By making the abstract and disparate qualities of this problem space tangible, we can develop chatbots that are more capable of handling race-talk in its many forms. Our goal is to provide the HCI community with ways to begin addressing the question, how can chatbots handle race-talk in new and improved ways?
