SOCIAL MEDIA EFFECTS
From algorithms to echo chambers, this page walks through how platforms filter information long before we ever decide to vote. All of the ideas here are connected to the studies and reports summarized on the Data & Evidence page.
Platforms rank posts based on engagement, not accuracy or balance. That changes what shows up at the top of your feed and which political ideas feel “normal.”
Posts that attract lots of likes, comments, and watch time get pushed to the top, even if they’re oversimplified, emotional, or misleading.
The system quietly tracks what you pause on, share, or save, and it learns which topics keep you scrolling the longest.
Over time, your feed shrinks to a small slice of everything online, making some viewpoints feel “normal” and others almost invisible.
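To make the ranking idea concrete, here is a minimal sketch of engagement-based sorting. The post fields and weights are purely hypothetical; real platform ranking systems are far more complex and proprietary. The point is only that the score rewards reactions and watch time, with no term for accuracy.

```python
# Illustrative sketch only: hypothetical weights, not any real platform's formula.

def engagement_score(post):
    """Score a post by likes, comments, and watch time (assumed weights)."""
    return post["likes"] + 2 * post["comments"] + 0.5 * post["watch_seconds"]

def rank_feed(posts):
    """Return posts sorted so the highest-engagement items appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "nuanced-explainer", "likes": 40, "comments": 5, "watch_seconds": 30},
    {"id": "outrage-clip", "likes": 900, "comments": 300, "watch_seconds": 55},
]
ranked = rank_feed(posts)
print([p["id"] for p in ranked])  # the emotional clip outranks the explainer
```

Notice that nothing in the score asks whether a post is true or balanced, which is why a misleading but provocative clip can beat a careful explainer.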
Many people mostly see viewpoints that fit their existing beliefs. Over time, that can make disagreement feel rare, extreme, or even dangerous — especially when every screen looks the same.
Feeds are curated — you never see “everything.”
Algorithms reward strong reactions and watch time, not accuracy or fairness.
Echo chambers can make extreme views feel more common and more “normal” than they really are.
Knowing how your feed works helps you be an intentional voter instead of a passive scroller.
Now that you’ve seen how feeds work, you can dig into the research behind these ideas or turn what you learned into an actual voting plan.
Visit the Data & Evidence page for short, readable breakdowns of the main studies and real-world cases behind this project.
Go to Data & Evidence →
Use the Take Action page to slow down your feed, double-check information, and make a real voting plan.
Go to Take Action →