Farhana Shahid
Hi, I am Farhana from Bangladesh! I study problematic behavior in online communities, particularly in the context of the Majority World (i.e., the Global South).
I am a 3rd-year PhD student, advised by Professor . I study how the content moderation infrastructures in place to address problematic content impact users in the Majority World. I focus on this region because people from these communities not only comprise the largest and fastest-growing user bases of online platforms but also experience the worst of online harms, both as users and as content moderators. I use mixed methods to bring new insights to the impact and design of content moderation systems by integrating the diverse values of users at the margins. Through my research, I advocate for creating online spaces that care for and are responsible to the Majority World users with whom and for whom I work.
I completed my Bachelor's and Master's in Computer Science and Engineering, advised by Professor . I also worked as a lecturer in Computer Science.
Apart from research, I am very much drawn to books, concerts, jigsaw puzzles, long walks, and full moons.
- Apr 24: Our paper on conversational agents to address harmful content in WhatsApp groups got accepted to CSCW 2024!
- Apr 24: Our paper on content moderation in WhatsApp groups won the Best Student Paper Award at Social Media and Society in India Symposium!
- Apr 24: I gave a talk at the Social Computing Research Lab at the University of Michigan!
- Mar 24: Received two special recognitions for my reviews for CSCW 2024.
- Feb 24: This summer I'll be interning with the Research team at the Center for Democracy and Technology!
- Feb 24: I gave a guest lecture in the Antisocial Computing class at UC San Diego!
- Jan 24: I gave a talk at the HCI Graduate Seminar at UC Berkeley!
Publications
2024
Dhruv Agarwal, , and Aditya Vashistha. To appear in CSCW 2024.
2022
2020
, Wasifur Rahman, M. Saifur Rahman, Sharmin Akther Purabi, Ayesha Seddiqa, Moin Mostakim, Farhan Feroz, Tanjir Rashid Soron, Fahmida Hossain, Nabila Khan, Anika Binte Islam, Nipi Paul, Ehsan Hoque, and A. B. M. Alim Al Islam. In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT '20): Vol. 4, No. 3, Article 94.
, Shahinul Hoque Ony, Takrim Rahman Albi, Sriram Chellappan, Aditya Vashistha, and A. B. M. Alim Al Islam. In Proceedings of the ACM on Human-Computer Interaction: Vol. 4, Issue CSCW1, Article 65.
Research
Rethinking Content Moderation Infrastructures
The existing content moderation infrastructures of Silicon Valley-based social media corporations are mostly rooted in Western values. They barely address the unique needs and circumstances of users from the Majority World, who comprise the majority of these platforms' user base. This creates power imbalances: Western corporations control the narrative of appropriate speech online while exploiting cheap labor from the Global South to filter harmful content on their platforms. Due to inappropriate removal of culturally relevant content and a lack of moderation tools to curb harmful content in non-Western languages, users from the Majority World often suffer more than their Western counterparts. In response, my work aims to unravel the hidden hierarchies in content moderation infrastructures and promotes reimagining content moderation systems from a critical perspective.
Publications:
Misinformation in the Majority World
As the growth and popularity of many major social media platforms have saturated in the Global North, platforms are now targeting users from the Majority World. Many of these first-generation Internet users have low literacy, lack technological know-how, and are exposed to global platforms that do not incorporate local values and sensibilities into their design. All of this makes them more susceptible to the harms of online misinformation and fake news. However, there is little focus on how users from the Majority World perceive and interact with different modalities of misinformation (i.e., text, image, video, or audio). To address this, we study what factors influence users' perceptions of misinformation and explore how to incorporate responsible design practices into social media to better help these users deal with online misinformation and fake news.
Publications: ,
PTSD within Low-Resource Communities
A critical lack of trained mental health professionals, communication and cultural barriers, and social stigma around mental health in low-resource communities expose traumatized individuals to greater health risks. Traditional diagnostic tools for PTSD often suffer from under-utilization due to various issues associated with human-human interactions when applied within resource-scarce populations. We therefore explore alternative methods to potentially screen for PTSD using low-cost, off-the-shelf tools, e.g., portable EEG headsets and free-hand sketches made with simple pencil and paper. Findings from our fieldwork involving refugees and slum dwellers enable us to identify scope for improvement in screening potential cases of PTSD within low-resource communities.
Publications: ,
Teaching
Appointments
Graduate Teaching Assistant
Lecturer
Lecturer
Courses Taught
- Computer Graphics (Spring 2021, Fall 2020, Summer 2020, Spring 2020, Fall 2019, Summer 2019, Spring 2019, Spring 2018)
- Pattern Recognition (Fall 2020, Summer 2020)
- Artificial Intelligence (Spring 2020, Fall 2019)
- Object Oriented Programming (Fall 2019, Summer 2019, Spring 2019, Fall 2018, Summer 2018)
- Digital Logic Design (Spring 2020, Summer 2018)
- Algorithms (Summer 2019, Fall 2019, Summer 2018)
- Data Structures (Summer 2019, Summer 2018)
- Operating Systems (Spring 2018, Fall 2017)
- Mathematical Analysis for Computer Science (Fall 2017)
Courses Assisted
- Technology for Underserved Communities (Spring 2023)
- Computing and Global Development (Spring 2024, Fall 2022, Fall 2021)