Research

Penn Media Accountability Project (PennMAP)

PennMAP is building technology to detect patterns of bias and misinformation in media from across the political spectrum and spanning television, radio, social media, and the broader web. We will also track consumption of information via television, desktop computers, and mobile devices, as well as its effects on individual and collective beliefs and understanding. In collaboration with our data partners, we are also building a scalable data infrastructure to ingest, process, and analyze tens of terabytes of television, radio, and web content, as well as representative panels of roughly 100,000 media consumers over several years.

COVID – Philadelphia

Our team is building a collection of interactive data dashboards that visually summarize human mobility patterns over time and space for a collection of cities, starting with Philadelphia, while highlighting potentially relevant demographic correlates. We are estimating a series of statistical models to identify correlations between demographic and human mobility data (e.g., do age, race, gender, or income level predict social distancing metrics?) and are using mobility and demographic data to train epidemiological models designed to predict the impact of policies around reopening and vaccination.
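
As an illustration of the kind of statistical model involved, the sketch below regresses a hypothetical social-distancing metric on census-tract demographics. The file name (mobility_demographics.csv) and the column names (stay_at_home_pct, median_age, median_income, pct_below_poverty, pct_minority) are illustrative assumptions, not the project's actual data schema.

```python
# Minimal sketch: do tract-level demographics predict a social-distancing metric?
# Assumes a CSV with one row per census tract and the hypothetical columns named below.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mobility_demographics.csv")  # hypothetical input file

# Ordinary least squares: social-distancing metric ~ demographic covariates
model = smf.ols(
    "stay_at_home_pct ~ median_age + median_income + pct_below_poverty + pct_minority",
    data=df,
).fit()

# Coefficients indicate which demographic variables correlate with distancing behavior
print(model.summary())
```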

High-Throughput Experiments on Group Dynamics

To achieve replicable, generalizable, scalable, and ultimately useful social science, we believe it is necessary to rethink the fundamental “one at a time” paradigm of experimental social and behavioral science. In its place, we intend to design and run “high-throughput” experiments that are radically different in scale and scope from the traditional model. This approach opens the door to new experimental insights, as well as new approaches to theory building.

Common Sense

This project tackles the definitional conundrum of common sense head-on via a massive online survey experiment. Participants are asked to rate thousands of statements, spanning a wide range of knowledge domains, in terms of both their own agreement with the statement and their belief about the agreement of others. Our team has developed novel methods to extract statements from several diverse sources, including appearances in mass media, non-fiction books, and political campaign emails, as well as statements elicited from human respondents and generated by AI systems. We have also developed new taxonomies to classify statements by domain and type.
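
To make the two rating dimensions concrete, here is a minimal sketch of how per-statement responses might be aggregated. The data layout and column names (statement_id, i_agree, others_agree) are illustrative assumptions rather than the project's actual pipeline.

```python
# Minimal sketch: aggregate the two rating dimensions for each statement.
# Assumes one row per (participant, statement) with hypothetical boolean columns.
import pandas as pd

ratings = pd.read_csv("ratings.csv")  # hypothetical file: statement_id, i_agree, others_agree

summary = ratings.groupby("statement_id").agg(
    agree_rate=("i_agree", "mean"),                  # share of respondents who personally agree
    perceived_agree_rate=("others_agree", "mean"),   # share who believe most others agree
)

# Statements scoring high on both dimensions are natural candidates for "common sense."
print(summary.sort_values(["agree_rate", "perceived_agree_rate"], ascending=False).head())
```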

News

Joe Biden’s (but not Donald Trump’s) age: A case study in the New York Times’ inconsistent narrative selection and framing

On the weekend of March 2-3, 2024, the landing page of the New York Times was dominated by coverage of their poll showing voter concern over President Biden’s age. Many Democrats raised concerns about the poll’s methods, especially its low response rate and leading questions. But as a team of researchers who study both survey methods and mainstream media, we are not surprised that people are telling pollsters they are worried about Biden’s age. Why wouldn’t they? The mainstream media has been telling them to be worried about precisely this issue for months.

Hyperpartisan consumption on YouTube is shaped more by user preferences than the algorithm

Given the sheer amount of content produced every day on a platform as large as YouTube, which hosts over 14 billion videos, some form of algorithmic curation is inevitable. Because partisan videos of a conspiratorial or radical nature have attracted millions of views on YouTube, observers have speculated that the platform’s algorithm unintentionally radicalizes users by recommending hyperpartisan content based on their viewing history.

But is the algorithm the primary force driving these consumption patterns, or is something else at play?

The YouTube Algorithm Isn’t Radicalizing People

About a quarter of Americans get their news on YouTube. With its billions of users and hours upon hours of content, YouTube is one of the largest online media platforms in the world.

In recent years, there has been a popular narrative in the media that videos from highly partisan, conspiracy theory-driven YouTube channels radicalize young Americans and that YouTube’s recommendation algorithm leads users down a path of increasingly radical content.