Online personalization algorithms are leading many content viewers to narrower choices.

If you’re on social media, you’re probably experiencing an online echo chamber. You’re seeing content and opinions primarily posted by friends and by the media outlets you follow. Like a real-world echo chamber, these messages bounce ideas and information to you that are similar to the ones you put out into the world.

While it’s commonplace to connect with like-minded people in social settings, echo chambers are exacerbated as more social interactions shift online. You may realize that your feeds are biased, but the impact of personalization algorithms — used, for instance, by Facebook — can be subtle and multiplicative. These algorithms lead to even more personalization over time. The chambers become deeper as the algorithms and the choices users make about what to read and watch mutually reinforce each other. The trend makes it harder for dissenting views or entirely new information to permeate our online worlds, and it reinforces our own beliefs and information sources.

Many businesses conduct marketing activities and develop their brands in online environments that tightly curate the content users encounter. Executives may not be aware of the extent to which algorithms shape this experience. Understanding the context in which messages are delivered is a critical first step. Moreover, our research shows that there are ways the echo chamber itself might be managed.

The Interests of the Crowd vs. the Interests of the Individual

For our research, we examined two things: the types of users most likely to get caught in content echo chambers, and the role of the content’s popularity — such as “like” counts and view counts. We created a simple online environment for content exploration that broke down search into two dimensions — content topic and content popularity. We then looked at the ways that individuals moved through the material. We observed, for instance, the weight that users placed on their own interests versus those of the crowd, and how these patterns related to an individual’s characteristics, such as how social they considered themselves and whether they tried to influence others in their social circles (what we term “opinion leaders”).

Our assumption was that users who conduct little exploration and rely more heavily on the crowd will, over time, see less diverse content. As a result, they’ll be at a higher risk of getting caught in an echo chamber.

In our experimental search environment named TED-it, 1,846 study participants explored the collection of TED talks posted on YouTube (roughly 1,600 short videos).1 Participants navigated using two buttons, Category and Popularity. The Category button allowed users to choose one of 15 content groupings and presented a list of talks in random order, without any ranking. By contrast, the Popularity button sorted the displayed search results by their number of views on YouTube — from most to least popular — or simply sorted all talks by popularity if no category was chosen. Users could click each of the buttons as many times as they liked, creating a search sequence. This search journey was the object of our study.
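The two-button mechanic described above can be sketched in a few lines of code. This is an illustrative reconstruction, not the study’s actual implementation: the talk data, category names, and view counts below are invented placeholders.

```python
import random

# Hypothetical talk records standing in for the ~1,600 TED talks on YouTube.
talks = [
    {"title": "Talk A", "category": "Technology", "views": 1_200_000},
    {"title": "Talk B", "category": "Technology", "views": 45_000},
    {"title": "Talk C", "category": "Design", "views": 800_000},
    {"title": "Talk D", "category": "Design", "views": 15_000},
]

def category_search(category, rng=random):
    """Category button: talks from one grouping, shown in random order
    (no ranking of any kind)."""
    results = [t for t in talks if t["category"] == category]
    rng.shuffle(results)
    return results

def popularity_sort(results=None):
    """Popularity button: sort the currently displayed results — or all
    talks, if no category was chosen — from most to least viewed."""
    pool = results if results is not None else list(talks)
    return sorted(pool, key=lambda t: t["views"], reverse=True)

# A user's search journey is the sequence of button presses, for example
# a Category search followed by a Popularity re-sort:
journey = popularity_sort(category_search("Technology"))
```

The point of the design is visible in the sketch: Category alone exposes users to unranked, topic-driven results, while each press of Popularity re-imposes the crowd’s ordering on whatever is displayed.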

The relationship between search patterns and viewers’ social characteristics was examined through a series of questionnaires that assessed users’ sociability, opinion leadership, and previous experience with TED content, along with some demographics.

As with most content platforms, there are popular TED talks and many less-viewed options. We found that, generally, people who explored content with less reliance on its popularity metrics ended up with content that was less known by their peers but better suited to their personal interests. As assumed, users who consistently searched by their topics of interest and relied less on the crowd’s previous choices were less likely to suffer from echo chambers. On the flip side, people who were susceptible to echo chambers in our research setting relied strongly on Popularity sorting and made little use of Category search.

Furthermore, we found that three types of people were most at risk for falling into echo chambers: highly social individuals, those already familiar with TED content, and young users. They were all more likely to rely on popularity considerations and explore less. They also were less likely to choose from unsorted topical results.

People who consider themselves opinion leaders were more likely to explore the videos and relied less on the content’s popularity. We found that male opinion leaders in our sample conducted more topic-based exploration and invested more effort in search than male non-leaders. Apparently, these opinion leaders are more likely to seek new avenues for influence and to look for novel content to introduce to their followers, which may ameliorate their group’s own echo chamber.2

We also studied how influenced people are by knowing how popular a piece of content is. With “like” counts and view counts now baked into most online environments, was this information making people more likely or less likely to explore unfamiliar territories?

To answer this question, we randomly assigned users to one of two conditions, where popularity information was either shown or blocked in category-based searches. Interestingly, in our study, only male opinion leaders were affected by the display of view counts.

The Case for Not Showing “Like” Counts

The findings present thought-provoking implications and considerations for content providers.

For instance, should media managers suppress, or at least reduce, the visibility of content popularity information to increase the diversity of content consumed by site visitors? If they want consumers to venture down the long tail of content, the answer is yes. Reducing the visibility of like counts and view counts may help mitigate echo-chamber effects.

Taking popularity information away might seem like a radical experiment, but in our analyses, there were no statistically significant correlations between users’ exploration characteristics and their enjoyment of the content. This suggests that at least in the realm of curated content, such as TED talks, popular content is not inherently superior to less popular content. Therefore, designing online environments that encourage topic-based exploration, that give increased visibility to less-viewed content, and that have reduced visibility of view counts could alleviate content echo chambers with little impact on views and user satisfaction. In such environments, managers targeting social media messaging to opinion leaders are more likely to break the established patterns of those leaders and increase the reach of their organization’s message.

Business leaders should be aware of echo-chamber dynamics and the human factors feeding into them. Our findings can help identify individuals who are seeing more or less diverse content, which can help shape marketing strategies in online environments that are increasingly dominated by algorithms.


1. TED is a nonprofit organization devoted to spreading ideas, usually in the form of short talks (18 minutes or less). TED stands for Technology, Entertainment and Design, though TED talks today may cover any topic. We use TED talks in compliance with their Creative Commons license.

2. There were no statistically significant patterns for females in our sample.