Earlier this year, Meta, the parent company of Instagram and Threads, changed how users would see political content.
While the ongoing rollout aimed to “deliver the content users find most valuable” on Instagram and Threads, the new recommended default settings were met with criticism from big tech companies, Texas lawmakers and content creators due to a lack of transparency about which posts count as political content.
Previously, under the direction of CEO Mark Zuckerberg, Meta used user feedback to temporarily reduce the distribution of political content in Facebook News Feeds. This was based on negative feedback from users suggesting that political content took over the News Feed and negative feedback on posts about political topics.
“What’s interesting about social media is, it’s really democratized,” said Justin Sinkovich, the associate director of the School of Business and Entrepreneurship. “What we’re seeing here is a response to potential misinformation because there’s not a gatekeeper per se.”
Sinkovich, who specializes in marketing, including digital marketing, with a background in social media and media distribution, said that social media empowers voices outside of traditional media, which has led to the spread of misinformation.
“It’s been an interesting new paradigm in politics,” Sinkovich said. “Even foreign countries have stepped in and had a hand in disseminating the information that they want to put onto social media to help drive election results.”
Users will currently still see political content if they choose to follow a content creator who posts anything included within the limitations; however, Meta platforms will not recommend political content on Reels, Threads, the Explore page or in-feed recommendations. These controls also began rolling out on Facebook on Sept. 4, 2024.
Junior film and television major Aliya Brown said social media is essential for creatives and creators to grow and receive important information.
“I know some people use social media as a way to escape from that [political content],” said Brown. “I don’t know, I feel like it still shouldn’t be a thing, because social media should be the social world of the whole world; I think all content should be shown, I don’t think anything should be censored.”
Meta posted an update in September in its transparency center stating: “We use AI systems to personalize the content you see based on the choices you make; we aim to avoid making recommendations on Facebook, Instagram, and Threads that could be political in nature.”
Sinkovich believes that algorithms on social media, including those generated by AI systems, create a snowball effect leading to the polarization of voices.
“If you can define an audience more leaning in a certain way, you can often pick up on different preferences and demographics that they represent,” said Sinkovich. “An audience that believes in this political view is more likely to have these sorts of preferences, and they also can predict that they might be more interested in certain products and goods that you can sell them.”
In response to the political filter, Accountable Tech, a nonprofit organization that advocates for structural reform to make the internet safer, conducted a study on algorithms for five prominent accounts, including Democrat Hillary Clinton, that regularly post about political or social topics.
“There was a reduced reach of a considerable amount, over a 50% decline in reach over about a three-month period while this policy was going into effect,” said Zach Praiss, the campaigns director at Accountable Tech.
Social media plays an important role in news consumption for young adults. According to Pew Research Center, roughly 54% of adults use Instagram, with 20% seeking out information regularly on the platform.
“It’s such a crucial election, and I want to know what is happening in the election, and how polling numbers are going,” senior musical theater major Ines Manuel said. “If you’re only going by Instagram, then you don’t really know how close the race is.”
Meta’s political preferences filter also sparked concern for users and content creators due to an error that overrode past settings. The glitch caused posts to remain limited even after opting in for political content each time the app reopened.
Andy Stone, the communications director for Meta, posted on Threads in June apologizing to users and identifying the error a day before the first 2024 debate between President Joe Biden and former President Donald Trump.
“Gen Z is mostly on social media for their news, and to have this glitch that can alter their perception — it is not helping,” Manuel said. “It almost seems manipulative; you feel like you have the freedom of choice, but you actually don’t.”
Sophomore music major Manu Lopes said that the content preferences make her feel insecure as an international student.
“The information that is being taken [away] from us takes the power to help as well, our agency is gone,” Lopes said.
The use of broad language such as “social topics” and Meta’s refusal to clarify what constitutes “political content” also raise concerns for many content creators.
Brown said she finds the content preferences policy discouraging for creators.
“At one point, you’re getting good views and good engagement. Next second, you’re like, where everybody go?” said Brown.
In response, Accountable Tech teamed up with creators to send an open letter to head of Instagram Adam Mosseri, demanding clearer guidelines. The letter argued that the political content filter limits inclusion and participatory democracy.
“We had over 200 creators signing on to a letter to Meta, to reverse this decision to limit political content by default for all users,” Praiss said. “Instead give users the option to limit political content if they choose, but not to default people into that setting without notice or transparency.”
Brown also argues that many content creators who rely on engagement for compensation will be affected by the content limit.
“Social media can be a job; it’s important to use that as, like, another source of income,” said Brown. “When I can’t get that reach, it takes those opportunities away.”
In addition to political content, the “content preferences” tab also reduces fact-checked content by default, meaning that users may see false or altered content unless they opt out. While the setting offers users the option to reduce more, Meta does not specify what is included in further reducing false or altered content. Altered content that has been reviewed by a third-party fact-checker won’t be removed from social media unless it violates community standards, which is separate from flagged false content.
Sinkovich says that the algorithm risks spreading misinformation because automated systems determine which posts fit into the filtered political categories.
“These automations aren’t in any way perfect, they’re flawed, and with that can have unintended consequences,” said Sinkovich. “Perhaps these platforms are aware and they are willing to proceed with the way that that operates, despite the drawbacks and unintended consequences.”
While the content preferences feature aims to make user feeds less cluttered and more pleasant for users, political figures, social groups and content creators like Brown worry that the filter will limit them from disseminating valuable information to users who primarily receive news and content online.
“From a content creator perspective, when I’m posting on different social medias, you think, ‘Oh, it’ll do good on this app, or maybe do good on this app,’” Brown said. “Knowing it’s like that on all of them, it’s kind of like, what’s the point?”
Copy edited by Manuel Nocera