CDAC Distinguished Speaker Series: Andrea G. Parker (Georgia Tech)
Trends & Implications of COVID-19 Information Exposure Amongst Vulnerable Populations
Abstract: Information-seeking online has become a crucial lifeline for many individuals as they search for knowledge and resources to counteract a myriad of social, health, safety, and financial COVID-19 challenges. However, research has shown that online information is less accessible to populations such as low-income adults. Furthermore, exposure to pandemic-related information can itself harm vulnerable populations’ wellbeing, for example through misinformation and through content that heightens anxiety amid existing stressors and discourages behaviors that can prevent the spread of COVID-19. As low-income, Black, and Hispanic adults are experiencing disproportionately high rates of COVID-19-related death, severe sickness, and life disruption, our research investigates the extent to which COVID-19 information is supporting or inhibiting the wellbeing of these populations.
In this talk, I will discuss results from a 9-wave longitudinal survey we conducted from July 2020 to January 2021, focusing on Black, Hispanic, and low-income adults. Our findings characterize respondents’ attitudes towards COVID-19 information, their levels of exposure to various kinds of COVID-19 information, and associations between information exposure and psychological wellbeing. In addition, our results characterize how these trends evolved over several months of the pandemic. These insights serve to inform human-centered computing, data science, and public health research and practice focused on the wellbeing of vulnerable groups during public health crises.
[A virtual student roundtable with Dr. Parker for UChicago students will follow the talk at 12:15 p.m., register here.]
Bio: I am an Associate Professor in the School of Interactive Computing at Georgia Tech, and an Adjunct Associate Professor in the Department of Behavioral Sciences and Health Education, within the Rollins School of Public Health at Emory University.
I am the founder and director of the Wellness Technology Research Lab.
Previously, I was an Assistant Professor in the Khoury College of Computer Sciences and the Bouve College of Health Sciences at Northeastern University.
In 2012, I completed a postdoctoral fellowship in the Everyday Computing Lab at the Georgia Institute of Technology, where I worked with Dr. Elizabeth Mynatt and collaborated with Dr. Veda Johnson at Emory University School of Medicine. I hold a Ph.D. in Human-Centered Computing from the Georgia Institute of Technology and a B.S. in Computer Science from Northeastern University.
My research contributes to the fields of Human-Computer Interaction (HCI), Computer Supported Cooperative Work (CSCW), and Health Informatics. I design and evaluate the impact of software tools that help people manage their health and wellness. My research specifically focuses on health equity. I study racial, ethnic, and economic health disparities and the social context of health management. I take an ecological approach to technology design, whereby I conduct in-depth fieldwork to examine the intrapersonal, social, cultural, and environmental factors that influence a person’s ability and desire to make healthy decisions, and how technology can support wellness in this context.
Part of the CDAC Winter 2021 Distinguished Speaker Series:
Bias Correction: Solutions for Socially Responsible Data Science
Security, privacy and bias in the context of machine learning are often treated as binary issues, where an algorithm is either biased or fair, ethical or unjust. In reality, there is a tradeoff between using technology and opening up new privacy and security risks. Researchers are developing innovative tools that navigate these tradeoffs by applying advances in machine learning to societal issues without exacerbating bias or endangering privacy and security. The CDAC Winter 2021 Distinguished Speaker Series will host interdisciplinary researchers and thinkers exploring methods and applications that protect user privacy, prevent malicious use, and avoid deepening societal inequities — while diving into the human values and decisions that underpin these approaches.