I am an assistant professor in the Information School at the University of Washington. My research interests are in Social Computing, where I combine ideas from computer science and social science to uncover insights about social life online via large datasets. Currently, one major focus of my research is understanding and designing defenses against problematic information on online social platforms. My work employs a range of interdisciplinary methods from the fields of human-computer interaction, data mining, machine learning, and natural language processing.
From August 2017 to 2020, I was an assistant professor in the Department of Computer Science at Virginia Tech. Before that, I received my PhD in Computer Science from Georgia Tech. Currently, I am also an adjunct affiliate of UW CSE, affiliate faculty of the Center for an Informed Public, and a co-founding director of RAISE, a Center for Responsibility in AI Systems and Experiences.
Current Research Topics
Auditing and Contesting Algorithmic Decisions
There is a growing concern that problematic content online is often amplified by the very algorithms driving online platforms, be it Twitter’s trending topics or YouTube’s video recommendation algorithm. Yet their opacity makes it difficult to determine how and when algorithms amplify such content. How do we advance responsibility in algorithm design? My lab conducts computational audits to determine the adverse effects of algorithms. We conducted the first systematic misinformation audit of YouTube to empirically establish the “misinformation filter bubble effect,” followed by an audit of an e-commerce platform revealing how algorithms amplify vaccine misinformation. We are also interested in questions around contestability in large-scale online systems.
Understanding Problematic Phenomena
Today, online social systems have become integral to our daily lives. Yet these systems are rife with problematic content, whether harmful misinformation, damaging conspiracy theories, or violent extremist propaganda. Left unchecked, these problems can negatively impact our democracy and society at large. My lab has been investigating what makes people join online conspiratorial communities, how dramatic events affect conspiratorial discussions, and what narrative motifs characterize those discussions. We have also studied online extremism, answering questions ranging from how hate groups frame their hateful agendas to what roles their members play.
Designing to Defend Against Problematic Information
Our lab is also interested in designing systems to counter problematic information. Check out our work on OtherTube and NudgeCred. NudgeCred is a socio-technical intervention powered by the idea of nudges: choice-preserving architectures that steer people in a particular direction while still allowing them to go their own way. We have also begun delving into questions around designing for trustworthy journalism. How can news organizations effectively demonstrate to the public the key aspects of good journalism — the primary features that make a story trustworthy and the core aspects that govern how a news story is produced and reported?
Understanding Misinformation in the Global South
Misinformation research has primarily focused on the Global North while largely ignoring the rest of the world. In the next several years, we strive to go beyond this US/Euro-centric focus. Most of this work will be pursued through two grant initiatives: a Fact-Checking Innovation grant, which has enabled us to work with fact-checkers in Kenya, and an ONR-YIP early career award, which seeks a holistic understanding of adversarial online influence in the Indian Ocean Region (IOR).
Recent Publications (all)
Human and Technological Infrastructures of Fact-checking
P. Juneja, T. Mitra | CSCW 2022 | paper | doi
Pathways through Conspiracy: The Evolution of Conspiracy Radicalization through Engagement in Online Conspiracy Discussions
S. Phadke, M. Samory, T. Mitra | ICWSM 2022 | paper | doi | Best Paper Award
OtherTube: Facilitating Content Discovery and Reflection by Exchanging YouTube Recommendations with Strangers
M. Bhuiyan, C. Isaza, S. Lee, T. Mitra | CHI 2022 | paper | doi
Characterizing Social Movement Narratives in Online Communities: The 2021 Cuban Protests on Reddit
B. Keith, T. Mitra, C. North | Computational Journalism 2022 | paper | doi
Design Guidelines for Narrative Maps in Sensemaking Tasks
B. Keith, T. Mitra, C. North | Journal of Information Visualization 2022 | paper | doi