Coming on the heels of a surge in updates and new features over the past few months, Instagram has introduced a "false information" label that alerts users when a post contains misinformation. The pop-up, which appears on both Instagram and Facebook, lets you either tap "see post" or "see why" to find out exactly why the post was flagged in the first place. According to Facebook, the feature is meant to "protect the democratic process" and help stop election interference as the 2020 election creeps up.
"The labels below will be shown on top of false and partly false photos and videos, including on top of Stories content on Instagram, and will link out to the assessment from the fact-checker," Facebook wrote in an Oct. 21 press release titled "Helping To Protect the 2020 US Elections." It continued,
In addition to clearer labels, we’re also working to take faster action to prevent misinformation from going viral, especially given that quality reporting and fact-checking takes time. In many countries, including in the US, if we have signals that a piece of content is false, we temporarily reduce its distribution pending review by a third-party fact-checker.
This goes for your own posts and Stories, too, of course. If you try to share something that's not quite true, a pop-up will read, "Independent fact-checkers say this post includes false information. Your post will include a notice saying it's false. Are you sure you want to share?"
In addition to enabling these pop-ups, Facebook says it's working to "fight foreign interference" and "increase transparency." The company plans to do this by updating its policies to protect elected officials' accounts, showing the confirmed owners of certain Facebook Pages, and adding a new US presidential spend tracker so users can see how much money has gone into each campaign ad that shows up on their screen.
Furthermore, Facebook says it's trying to address voter suppression on its platforms by blocking ads that imply voting is a waste of time, threaten violence, or put out misleading info. The press release states, "We remove this type of content regardless of who it’s coming from, and ahead of the midterm elections, our Elections Operations Center removed more than 45,000 pieces of content that violated these policies — more than 90% of which our systems detected before anyone reported the content to us."
These new efforts to push back against fake news and foreign interference come after the United States realized the extent to which Russia used social media to meddle in the 2016 election. In fact, according to the Daily Beast, up to 70 million Americans may have seen Facebook ads from Russian propaganda accounts. Now that Facebook has made moves to prevent such meddling, we'll have to see how things pan out after the 2020 elections wrap up. In the meantime, Instagram will let you know which posts you should second-guess.