Balancing News Bias On Facebook
Design a system that allows Facebook users to balance their news bias as well as to see the bias rating for their current and upcoming news consumption.
What are echo-chambers and filter bubbles, and how do they lead to news bias?
When you like to read a certain kind of news, whether conservative or liberal, it becomes your comfort zone, and you usually don't wish to leave it. You share this news with your circle of friends and with your followers on social media. Most of the people you follow will like your shared news posts because they form your echo-chamber or filter bubble: a group of people with similar interests. These people will also share the news you want to see, and that leads to news bias.
But you should know that there is another side to almost every news story, and there are other people with different news preferences. (Multiple echo-chambers and filter bubbles)
Why the Facebook app?
Facebook is one of the most widely used social media platforms in the world. Echo-chambers and filter bubbles often form on Facebook because the algorithm that underpins the platform reinforces each user's confirmation bias by continually recommending news and information that aligns with their past activity.
What is the problem statement?
How might we give Facebook users exposure to news and information outside of their echo-chambers and filter bubbles?
What is a solution?
Design a system that allows Facebook users to balance their news bias as well as to see the bias rating for their current and upcoming news consumption. This solution is integrated into Facebook itself and consists of two main features:
Media Balance Scale
Media bias rating system by allsides.com to show what types of news users are reading. (L=Left, LL=Lean Left, C=Center, LR=Lean Right, R=Right)
A dashboard showing overall ratings of the news users have consumed so far. It allows users to balance their news bias using the media balance scale.
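To make the five-point scale concrete, here is a minimal sketch of how the rating and its on-screen labels could be modeled. This is purely illustrative; it is not the actual AllSides or Facebook implementation, and the numeric scores (-2 to +2) are an assumption introduced here for later aggregation.

```python
# Illustrative sketch only: a five-point media bias scale like the one
# published by allsides.com, with assumed numeric scores for aggregation.
from enum import Enum


class BiasRating(Enum):
    LEFT = -2
    LEAN_LEFT = -1
    CENTER = 0
    LEAN_RIGHT = 1
    RIGHT = 2


def label(rating: BiasRating) -> str:
    """Short label shown on the media balance scale for each rating."""
    return {
        BiasRating.LEFT: "L",
        BiasRating.LEAN_LEFT: "LL",
        BiasRating.CENTER: "C",
        BiasRating.LEAN_RIGHT: "LR",
        BiasRating.RIGHT: "R",
    }[rating]
```

Mapping each rating to a signed score keeps "Center" at zero, which makes "balancing" a simple matter of pulling an average toward zero.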
For fullscreen experience: https://invis.io/2SV7ND4ACBZ
Who are the users?
Users are anyone 18 or older who visits Facebook daily to read the news and wants to change their news bias.
What was my work?
As an MS HCI student, my job was to work with teammates from defining the problem through evaluating the prototype with users. For this project, I did all of the visual design and prototyping.
My design process
As the final project for our Interaction Design Practice (IDP) class, we needed to choose a broad topic that addresses real human problems. We wanted to work on something that focuses on disinformation, misinformation, and echo-chambers on social media.
Disinformation and misinformation
According to yonder.co, disinformation is false information deliberately and often covertly spread in order to influence public opinion or obscure the truth. Misinformation is incorrect or misleading information spread inadvertently, without the intent to deceive.
Why do we think this is a problem?
64% of U.S. adults believe that disinformation has caused a great deal of confusion about the basic facts of current events, and 23% said they had shared fabricated political stories themselves, sometimes by mistake and sometimes intentionally.
Confirmation bias helps to account for users’ decisions about whether to spread content, thus creating informational cascades within identifiable communities. Since they focus on their preferred narratives, users tend to assimilate only confirming claims and to ignore apparent refutations.
Findings from user interviews
Before working on this project, we conducted user interviews to learn whether participants had experienced news bias on Facebook. We interviewed two groups:
Facebook users who visit Facebook every day
Experts who have knowledge about news bias
Overall, the major themes that emerged from the participant interviews and informed our design direction were: many people have become distrusting or skeptical of Facebook as a news source, and older people tend to believe Facebook is biased against their political leanings and are further affirmed in this notion by the media they consume on the platform.
Findings from secondary research
A couple of significant findings emerged from the secondary research. One key theme was that mistrust and disinformation are much more complex issues than we initially realized. Through our secondary research, we also found that the algorithms Facebook uses to curate relevant content for the user have created strong echo chambers and filter bubbles.
Why did we choose "Balancing news bias in echo-chambers on Facebook" as a problem statement for our project?
As we began to narrow down our problem space, we felt that echo chambers, filter bubbles, and confirmation bias were truly complex, and real-world human problems that were worthy of tackling as we move forward with our project. We felt that if we solve this problem statement, it would effectively address our overall goal by helping the user be more aware of their political biases. Given that many users are unaware of their biases, we believe that raising their awareness will encourage them to seek out more diverse points of view or media sources that go against their political leaning.
After defining the problem and doing research, we started working on concepts for the media balance scale and bias board. Here are the final paper prototypes/wireframes that we produced.
After finalizing the wireframes/paper prototypes, we started working on visual design. We did not redesign Facebook; we integrated our designs/concepts into the existing Facebook application. Here are the final visual designs.
Media Balance Scale
On each news post, you will see the media balance scale. On hover, you can see what each letter means.
In the given scenario, according to the media balance scale, the news is central. If you want to know why it's rated central, you can click on "C," and a modal with details will open.
Bias Board - Dashboard
On the Dashboard, users can see what type of news they have consumed so far.
It gives weekly, monthly, and yearly reports, which include each news item and its bias rating, the types of news the user has liked, and a graph of time spent on each news item.
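The aggregation behind such a report can be sketched as follows. This is a hypothetical illustration, not the designed system's actual logic: it assumes each consumed article is a pair of (bias score, seconds spent), with bias scores running from -2 (Left) to +2 (Right).

```python
# Hypothetical sketch of how a Bias Board report could aggregate a
# user's news consumption. Input: list of (bias_score, seconds_spent).
from collections import Counter


def weekly_report(articles):
    """Summarize consumption: per-rating counts, total reading time,
    and the average bias score (0.0 means perfectly centered)."""
    counts = Counter(score for score, _ in articles)
    total_time = sum(seconds for _, seconds in articles)
    avg_bias = (
        sum(score for score, _ in articles) / len(articles)
        if articles else 0.0
    )
    return {"counts": counts, "total_time": total_time, "avg_bias": avg_bias}
```

For example, one Left article, one Center article, and one Right article average out to a bias score of 0.0, which the dashboard would present as balanced consumption.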
Bias Board - Balance your bias
On this page, users can see where they currently fall on the news bias spectrum based on our rating system.
If users want to balance their news bias, they can drag the blue component toward "C". The system will then recommend news pages to follow that help centralize their news bias.
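One way this recommendation could work is sketched below. It is an assumption on my part, not the system's specified algorithm: given the user's current average bias score and a set of candidate pages (name plus bias score on the -2 to +2 scale), it ranks pages by how far following them would leave the user from the target set on the balance scale ("C" = 0).

```python
# Hypothetical sketch: recommend pages whose bias rating would pull the
# user's average toward the target chosen on the balance scale (0 = "C").
def recommend_pages(user_avg_bias, pages, target=0.0, k=3):
    """pages: list of (page_name, bias_score) tuples.
    Returns the k pages that best offset the user's current lean."""
    def distance_after_following(page):
        _, score = page
        # Naive model: following a page moves the user's average
        # halfway toward that page's bias score.
        return abs((user_avg_bias + score) / 2 - target)

    return [name for name, _ in sorted(pages, key=distance_after_following)[:k]]
```

For a user leaning left (average bias -1.5), this ranks Right and Center pages ahead of further Left pages, which matches the intent of dragging the scale toward "C".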
Usability Evaluations with Potential Users
Overall, all of the users who evaluated the product felt that it was a unique and effective solution. Their primary concerns were around the transparency and neutrality of the assigned ratings. Even with those questions, the users felt this tool could help them form a more informed and well-rounded opinion on political topics they discuss with friends and family they may disagree with politically. Lastly, they were skeptical about letting Facebook make judgment calls about political leanings, as they do not view Facebook as a neutral party in the current political discourse. Looking ahead, convincing users to trust the rating system appears to be the biggest barrier to product success.
Usability Evaluations with Experts
Overall, the key takeaway about task errors was that our attempts to seamlessly integrate the new features into Facebook made them blend in and become difficult to locate. Including a "new feature tutorial" may be necessary to give users initial awareness of these features. Such a tutorial would let the features remain seamless with the Facebook interface in everyday use while still being brought to the user's attention when first introduced.
Thank you for reading this project.
Copyright 2019, Ashish Durgude