Ironically enough, people have been turning to social media to talk about the bad feeling Netflix's documentary The Social Dilemma has left in their stomachs. With testimonies from ex-employees of Google, Facebook, Instagram, Twitter, and more, the documentary takes a deep dive into social media's impact on mental health, politics, and the economy. But is The Social Dilemma accurate, or just entertainment?
"I'm not trying to encourage people to delete social media, as opposed to just be aware of what it's doing and why and how," the film's director, Jeff Orlowski, told First Showing back in March. Orlowski added that as soon as some of the first viewers of the 93-minute documentary walked out of the Sundance screening, they shared with him that they planned to change their relationship with their phones.
Though issues like bullying, digital addiction, the spread of fake news, and surveillance are very real, the future of social media might not be as grim as the doc leaves viewers feeling.
"Social media isn’t all bad," cybersecurity expert Kristina Podnar tells Bustle, explaining that she felt slightly defensive after watching it. "In a time of the pandemic, social media has afforded us connections in ways that we couldn’t have done otherwise. We just need to understand the good aspects that come from it, what we are giving up for using the platforms, and how they need to change," she says.
Social Media Has Not Taken On A Life Of Its Own
"As humans, we’ve almost lost control over these systems. Because they’re controlling the information that we see, they’re controlling us more than we’re controlling them,” Sandy Parakilas, a former operations manager at Facebook, says in The Social Dilemma.
The documentary makes it seem like social media has run away from its creators, but Podnar says that this theory is "naïve and wrong." To imply that what's become of social media is a surprise is to imply that the people who made it didn't know what they were doing, she says. "People were getting paid to work at the social media companies — it is a business. Businesses exist for profit, so of course the platforms were always going to do things beyond connecting people [and] helping individuals find long-lost relatives," Podnar says. She adds that it's the nature of businesses that provide a free service to explore how to monetize your engagement, which has led to things like targeted advertising. "To now say that platforms took on a life of their own is shirking of responsibility by the very architects of the system," she adds.
Podnar explains that though some of social media's negative effects — like the rapid spread of misinformation, cyberbullying that leads to suicide, and data collection that makes users feel unsafe — might have been more impactful than anticipated, each of these effects was in the cards from the get-go. What's more, Podnar says that this could have been avoided if early developers had spent more time mapping out parameters and guardrails for social media in general.
Social Media Is Not Doomed
Just because social media feels like a threat in 2020 doesn't mean all platforms will always be dangerous, according to Podnar. "Social is a bit like the early days of car safety," she says. "Safety belts and horns were not incorporated when the car was developed. It took a lot of issues and harm before we recognized the need for security standards."
People are now starting to understand the impact that social media has on society, which is a good thing. Podnar believes that though the documentary's tone suggested that it's all downhill from here, social media will ultimately become regulated and safer for all. "There is no focused regulatory body nor enough citizen-driven focus (including financial spend and behavior) to drive the right behavior by the platforms," she says. But that doesn't mean that the regulations won't come.
The film's director acknowledges this nuance. "It's not about technology as a whole, it's about when technology is designed for us versus for somebody else. And there's this new class of technology that is designed for somebody else, and we are the resource that’s fueling it," Orlowski told First Showing.
While privacy regulations like the General Data Protection Regulation, the California Consumer Privacy Act, and the Protection of Personal Information Act are helpful in protecting against private data collection, Podnar says they are still young programs, and are not as effective as users need them to be. Further, data protection regulations only work to address users' privacy, not to patrol the safety of the platforms. "So missing is what is being done in terms of addiction to platforms, manipulation of humans, impersonations and identity hijacking, truth in content, as well as the ethical measures that need to be in place in order for platforms to function with integrity."
People Don't Understand Their Own Privacy
It's true that people don't know enough about what they agree to share when they use an app, or what is at risk by sharing that data, Podnar says. It's important for people to understand that their privacy is not out of their control, and they can limit what they share by being specific about what kind of apps they interact with. "We don’t have control of our data on social media today, but we do have control on whether we are on social media and what we share."
Interacting with apps without sharing more than you're willing to requires education. "The documentary rightfully points out that people just don’t know what is out there and how platforms are using their data. Everyone should know and have transparency into what is being done to their privacy and data."
If you're concerned about your data, Podnar suggests the following:
- Remember that if the service you are using is free, you are the product. "Protecting your privacy and well-being costs money, and if it is free, then that speaks to the value you are getting," Podnar says. Expect limited protections.
- Acquaint yourself with the platform. Go to settings and learn what you have control over. "Limit as much information about you being collected and opt out of as many things as possible," she says.
The People Who Made The Apps Aren't Necessarily The Most Fit To Fix Them
The ex-social media executives interviewed for the documentary couldn't quite come up with a fix for social media's issues. But they did say they felt compelled to be involved in the solution. "We have a moral responsibility, as Google, for solving this problem," Tristan Harris, an ex-design ethicist at Google, said.
But Podnar thinks the issue is much larger than the companies, or even the country. "Having North American and Northwestern Europeans 'rearchitect' platforms for the rest of the global population means that we will still get it wrong," she says. "This issue is a global issue and need[s] to be addressed with [a] global perspective."
Ideally, Podnar says the UN and other NGOs and non-profits would take on this role (as part of the discussion around human rights). At the very least, regulators from around the world should be working on these regulation changes together, says Podnar, not just a tight-knit group of people who are already familiar with the apps.
AI Is Not A Villain
Podnar agrees with the documentary that platforms have more information about us than they should, and that it is being fed into systems without enough human oversight. "The documentary gets it right that AI and predictive analysis is being widely used, but not just by social media platforms," Podnar says. She explains that AI is used in many fields, detecting everything from medical issues in scans to your interests on TikTok. She adds that it's a very dynamic technology that's not necessarily a bad thing, if proper parameters are in place. Most importantly, AI is not some personification of technology rising up against humanity. "Humans built the logic, and humans can alter the logic that drives the AI," she says.
But the biggest point that Podnar feels the documentary left out was the fact that tech companies can align their business models to use AI and data for good. "You could use data to screen individuals for signs of impending suicide or mental event and prevent that or treat the individual," Podnar lists as an example. "Alignment is needed, rather than an all-or-nothing proposition that the documentary seems to present."
If you can't stop thinking about The Social Dilemma and want to learn more about technology's impact on society, cybersecurity, and the future of connectivity, Podnar suggests queueing up Screenagers, Screenagers 2, The Great Hack, Terms and Conditions May Apply, Lo and Behold, and Algorithms: Secret Rules of Modern Living.