Experts Explain Why ‘Doin’ it for the Gram’ Makes Us Do Dumb Things
July 13, 2018

Social media mishaps have been in the news a lot recently. On July 3, the #PlaneBae saga erupted on Twitter, was celebrated briefly, then quickly condemned as an invasion of privacy. The same day, a trio of adventure YouTubers tragically died when they were swept over a waterfall in British Columbia. On July 6, an Instagrammer known for posting daredevil shots of his feet dangling off cliffs and skyscrapers fell to his death from a New York City building. And just this week, news broke that an influencer was bitten by a shark in the Bahamas in pursuit of the perfect pic.
The deluge is a tragic mix of poor judgement, risky behavior, and terrible luck. But the thing these stories all have in common is that the people involved were doin' it for the 'gram. It raises the question: are we becoming so thirsty for content that we're willing to ignore our better judgement just to get likes?
To get some perspective on how the human psyche factors into all this, VICE asked some experts to weigh in. Irina Raicu is the director of the Internet Ethics Program at Santa Clara University's Markkula Center for Applied Ethics. Erin Vogel, PhD is a postdoctoral fellow in the Department of Psychiatry at the University of California, San Francisco researching social media. Here's their take.
Questions and answers from separate interviews have been edited and combined here for clarity and brevity.
VICE: What’s your professional reaction to the #PlaneBae saga?
Erin Vogel: I think we’re living in a weird age, in terms of privacy and what we can expect from strangers. We’re used to the possibility of being in the background of other people's photos—you know, things that happen naturally when you’re out in public. But with social media, we don’t really know when other people are using our lives for entertainment. This idea of live tweeting or recording other people for Instagram or Snapchat stories is something that just sort of happens now, and I think social norms are still being formed around what’s okay and what's not. It’s really anxiety-provoking for people when they think about it.
Irina Raicu: The reason the [backlash to #PlaneBae] was so negative is because norms were violated here. The idea that nobody cares about privacy seems to be debunked by this story. The Golden Rule—the idea of, “Do unto others…”—is one reason people responded the way they did. They would not want that done to them. I wonder whether the woman who did the live tweeting thought about that and really considered those people as individuals who have the same rights and needs as she does. Would she have been comfortable if somebody did that to her? The response shows that most of us would probably not be.
Do you think the endless pursuit of content compromises our values?
Vogel: Yeah, I think in some ways it does. Especially now that more and more people are becoming influencers and YouTubers, the bar is that much higher and the competition is that much greater, and people have to do increasingly drastic and interesting things if they want to stay relevant.
Raicu: [The #PlaneBae] situation treated people as a means to an end, which ethicists have long said is not an ethical thing to do. Whatever the intent was, even if the intent was just entertainment, it wasn’t something that was supposed to help those people.
There’s the secondary issue of her capitalizing on this, but even if she hadn’t—even if it was just something she tweeted to her friends and it didn’t go as viral as it did, maybe stayed to 300 Twitter followers and not hundreds of thousands—this still feels like a problem.
What makes us abandon our better judgement?
Raicu: In the context of [#PlaneBae], I think it’s one thing to overhear a conversation. No one expects you to plug your ears. But no one expects you to start publicizing it without the people involved realizing that you’re doing it.
I wonder whether [...] there’s been a desensitization—that we are starting to see other people as content rather than individual human beings with rights and needs. As everything becomes a reality show, we forget that reality shows are actually staged, and we think that daily life is a reality show when it’s not. Most people choose not to participate in reality shows and don’t want to be forced to be the stars of one. That’s kind of how it feels in a story like this.
Vogel: Social media is a really quick and easy way to get reinforcement—to get positive reactions or even negative attention for the things we post, and that can be really tempting for a lot of people. They end up using social media very frequently, or worrying a lot about what they’re posting or how many followers they have, because they get used to having that sort of attention.
More than ever, people who would not otherwise have the opportunity to be famous or have a large following have a chance to do that, and they get really rewarded if they’re successful at it. Instagram influencers and YouTubers can make a ton of money just by narrating what they’re doing in their everyday life. That’s something we really haven’t seen before, where ordinary people can get famous for doing ordinary things.
What does an ethical presence on social media look like?
Raicu: I would not presume to know! I think that’s a good question and I think my virtue ethicist friends would say that Aristotle talked about practical wisdom. There aren’t any hard rules that you can follow all the time. It depends on the circumstances and the context. Social media has been used in conflicts to distribute video about human rights violations, so in some situations, the importance of getting information out might outweigh privacy concerns.
One tool that we have at the Center is called the Framework for Ethical Decision Making. It asks people to look at a potential decision through five different ethical lenses: rights, fairness and justice, the common good, utilitarianism (which means benefits versus harm), and virtue ethics.
Most of us believe we have a right to privacy. I think that’s one thing people are responding to. In terms of fairness, I think this feels unfair—the fact that there are parties that don’t know this is happening, while someone behind them is recording and publishing their conversations. It goes back to issues of consent online: we want companies and governments to ask for our consent, because that relates to our autonomy as human beings.
In terms of the common good, that’s defined as conditions under which society can function well. You know the conversations that we have sometimes when we travel? A discussion maybe lasts the length of a flight, then you never see that person again. Sometimes they can be really in-depth and meaningful, with people opening up to each other and then separating. Sometimes you remember them for the rest of your life, and there’s something valuable there, right? If we all start to worry that anything we say might be recorded and posted, then laughed at or admired or whatever, it’s going to reduce those kinds of interactions, I think. It’s going to make people leery, and that’s a drawback in terms of the common good.
Through a variety of these lenses, [the #PlaneBae decision] feels wrong. The framework suggests there are problems from a variety of ethical perspectives. We make ethical decisions all the time, and we’re not going to use the framework every time, but if people aren’t sure what the right thing to do is, then putting that decision through these ethical evaluations will at least make sure you don’t do anything that you deeply regret later. I realize that it’s unrealistic to do that kind of analysis all the time, but my colleagues who are virtue ethicists would argue that ethics is like a muscle that you develop, like athletes who develop muscle memory, and over time you will make better decisions.
What are the mental health costs of unwittingly becoming famous online?
Vogel: It can be very distressing for people, probably some more than others. People who generally like to keep their lives private probably wouldn’t think talking to a stranger on a plane would turn into a major invasion of privacy. So it’s really new, unusual, and scary for them.
Being doxxed can be distressing for anyone—the idea that all this information about you is out there, and people online tend to make really harsh judgements and say things they wouldn’t say in person. There’s a lot of research on anonymity [and groupthink] online and what it can lead people to do. Things can spiral out of control really quickly.
How can someone heal or protect themselves when they get doxxed?
Vogel: I think many of us could benefit from a social media detox of sorts, especially from taking a break when something really negative happens. Someone shouldn’t have to delete all their social media, if it’s something they enjoy, just because other people are harassing them. I think it's a shame that that happens a lot. But I think that can be the best way to go sometimes: take a break and then start over. But there aren’t really any great solutions to the problem yet, and I think that’s something we as researchers need to address and that the legal system is going to be dealing with more and more.
What’s the responsibility of platforms like Twitter, Instagram, Snapchat, and YouTube?
Raicu: I think they would be excellent educational vehicles if they used circumstances like this one as case studies to address their audience. They could show that they are not just about generating numbers and attracting eyeballs, which they’re getting criticized for now. Twitter, for instance, has been talking about generating healthy conversations, right? I think it would be really interesting if you opened your Twitter feed the next morning and there was some discussion from Twitter about whether this was a healthy conversation or not.
I don’t expect them to be able to recognize in real time when something like this is happening and try to address it by censoring content or something, but I think there might be a missed opportunity for them to really directly educate or interact with their users in a conversation about what is the right thing to do.
You don’t want to figure it out after the fact, or see the same thing repeated. A couple of years back, a couple was breaking up on a roof, and somebody overheard it and tweeted it. It feels like these things happen over and over again. There’s an opportunity for companies to ask: how do we want our platform to be used, how do we prevent these types of things, and how do we get our users to treat each other with more respect? That’s what I think this boils down to.
What is a practical solution to these issues?
Raicu: Some of the discussions happening in schools now are interesting. There’s a whole push for media and digital literacy, and helping students—as though adults have this down, and we certainly don’t—understand what constitutes false information, or trusted sources, and all of that. Some schools are having lessons in digital citizenship: how are you going to be a good participant online? Having those conversations with peers is, in a way, more effective, because if you see your friends reacting to something in a certain way, let’s say you didn’t think it was a privacy invasion but other kids your age explain why they felt that way, that might be more impactful than anything else. So encouraging those kinds of conversations would be really good.
The point is that we don’t think this woman was considering the other passengers as real people with their own needs and concerns. And depending on the circumstances, if you do that, you might subject vulnerable people to real harm. There could be people who are trying to escape domestic violence, or who knows what. So you have to really recognize other people as whole people, just like yourself, and I think instilling that in kids would be a good way to fight against this kind of desensitization, where it’s all about whether you can get people to hit the like button.
Follow Kara Weisenstein on Twitter.