I think it was Sun Tzu who once said, “to know your enemy, you must become your enemy.” Then again, I doubt that Sun Tzu had to deal with Russian trolls undermining one of the world’s largest democracies.
The environment for US political discourse today is… harsh? Yes. Let’s go with that. Harsh. It’s an environment where the loudest opinions (or the ones that elicit the strongest reactions) are the building blocks that shape the current conversation. It’s an environment where all political discourse usually ends up in a yelling contest, where both sides devolve into what can only be described as a Mad Max Beyond Thunderdome type of situation; only with fewer facts and more blood. It’s an environment that fosters selection bias for news (bad). It’s an environment that fosters misinformation (very bad). It’s an environment that let a Russian-led disinformation campaign affect the 2016 presidential contest (VERY, VERY BAD!).
As we all know by now, Russian propagandists purchased ads on multiple social media networks – most notably Facebook – to promote a variety of issues in hopes of creating dissent and chaos among the American electorate leading up to the 2016 presidential contest. The FBI found that the ads were designed to divide groups along racial, political, and cultural lines. Specifically, these groups created divisive ads that focused on topics like the Black Lives Matter protests, immigration, and the 2016 presidential election.
To understand how effective these Russian trolling campaigns were in creating information chaos within online and broader American society, here are some quick bullet points from a USA Today report that went through roughly 3,500 Facebook ads by the Russian-based Internet Research Agency:
- One out of every 3,517 ads created by Russian trolls was seen by Americans.
- The most prominent ad from the Internet Research Agency reached 1.3 million impressions and 73,000 clicks on Facebook.
- Internet Research Agency ads targeted specific individuals based on keywords found on user profiles, like targeting pro-Trump ads to the keywords “Blue Lives Matter” or “America First.” On social media platforms like Facebook, this created a higher chance that the ad’s targeted audience would engage with them.
As I was looking through the over 3,000 ads released by Democrats on the US House Intelligence Committee, there was one ad that stood out, though I couldn’t figure out why. It was this Bernie Sanders Coloring Book Ad.
After tedious research – to be read as an afternoon of Facebook stalking – I realized it was an ad that had been passed around by some of my Facebook friends back in 2016, before Hillary Clinton had clinched the Democratic nomination. While I wouldn’t categorize this group of Facebook friends as very political – they were people I knew back in college but hadn’t really talked to since – it was very obvious they were in the “Bernie 2016” camp. One of them had even joined the “LGBT United” group (which we now know was being run by Russian trolls).
Looking through these ads, you can understand why someone would be swayed to click on them. They are very clickbait-ish by nature. Scrolling through the over 3,000 ads created by the Internet Research Agency, you see the same headlines: “You Won’t Believe What Happens in This Video of [X]” or “Only True [X]ers Will Relate to This.” The subjects themselves are relatively benign; it’s only when you read further down in the post’s description that language promoting tribalization begins to surface. Because of that, their ability to spread on social media makes sense. After all, how many people do you think take the time to read the description of a post they casually Liked or Shared on Twitter or Facebook?
It’s here that I should tell you that based on… well… everything, it’s pretty obvious that Russian propagandists plan to do this again in future elections. It’s also here that I should tell you that both the Trump administration and Facebook are NOT taking any real steps in preventing this from happening again.
The Trump administration – particularly President Donald Trump – looks at any claim that the Russians interfered in the presidential contest as a referendum suggesting he didn’t “WIN ‘UGE” in 2016. For the administration, to even recognize this Russian propagandist campaign would put President Trump’s 2016 presidential victory into question. So they are treating it like it didn’t even happen. (Or in legal circles, this is also known as the “Na-Na-Na-I-Can’t-Hear-You-So-It-Doesn’t-Exist-Na-Na-Na” defense.)
As for Facebook, while they have promised that measures will be taken to stop the Russian trolling groups from doing this again, in reality nothing will happen, because around 98% of Facebook’s global revenue was generated through online advertising. My apologies, I should rewrite that factual tidbit with the proper inflection that it deserves: AROUND 98% OF FACEBOOK’S GLOBAL REVENUE WAS GENERATED THROUGH ONLINE ADVERTISING!!! That means that to implement any real measures to stop Russian trolls from spearheading a misinformation campaign like this again, they would need to fundamentally alter their basic income stream. Their ONLY income stream.
If it isn’t clear by now, between the Trump administration refusing to acknowledge that Russians interfered with the 2016 Presidential campaign and Facebook making little to no effort in implementing substantial changes in how they target paid ads to their users, nothing is really going to change. Well, not anytime soon anyway.
So, considering how successful the Russian troll campaign was and that the system looks to still favor their clickbait-ish ads, maybe the best course of action isn’t necessarily to push back against this? Or as Jack Donaghy once told Liz Lemon on 30 Rock:
“It was 1994, and I was ice climbing when I fell into a crevasse and hurt my leg. There was only one way out, so fighting every natural instinct I have, I did the thing I hated the most. I climbed down into the darkness. And when I came back to camp, I went to the person who cut my line and said, ‘Connie Chung, you saved my life.’”
It’s here that I’m assuming that you have decided to drown your better angels in the dark void that you have now sunken into. It becomes clear, very quickly, that there’s only one real answer here: we create our own Russian troll ad! And if you’re going to create your own Russian troll ads, you might as well know how to do them well. So, here are three easy-to-follow steps for creating your very own!
Step #1: Pick a Target Audience
It’s imperative that your ad inflames only a particular group on the Internet. This is best achieved by narrowing your ad to a specific target audience. When the Internet Research Agency was creating ads back in 2016, it would specifically target particular demographic groups by race, party affiliation, or political issue. Some examples of its targeted groups included:
- Donald Trump supporters
- Hillary Clinton supporters
- Bernie Sanders supporters
- Black Lives Matter supporters
- Blue Lives Matter supporters
- People who are against gun control
- People who are for gun control
Basically, if there is a divisive issue in American politics – and trust us when we say, there most certainly are – all you have to do is pick a side. Also, don’t be afraid to be creative; there are no wrong answers here. Is President Trump making disparaging remarks regarding [INSERT MINORITY GROUP HERE]? Instead of looking at it as him debasing the office of the presidency, think of it as him recommending a targeted group for your Russian ad!
Step #2: Decide What to Post
Once you’ve decided on the particular group to target, now comes the most important part: picking the right piece of media to post. This can be anything from a particular photo to a YouTube video to even a highlighted passage. While your choice of media is wide open, you should have a specific checklist of questions to ask yourself when doing so:
- Would a family member over the age of 65 either forward this to you with the message “this is what I was talking about last Thanksgiving,” share this on their Facebook profile, or get angry after viewing this post?
- Can someone look at the post and jump to very broad conclusions about topics that would otherwise be very complicated/nuanced?
- Does this elicit a gut reaction? If not, how can you make sure it does?
- Am I presenting as little factual information as possible about a particular situation and/or topic? Could I be providing even less factual information?
- Is this presenting a measured or practical view on current events? And if so, is there a way to frame the post so it doesn’t?
Step #3: TAGS, TAGS, TAGS, TAGS, TAGS, TAGS!
On almost every social media platform – whether that be Facebook, Twitter, Instagram, etc. – targeted advertising is always based on specific keywords used to reach certain audiences. For example, when the Internet Research Agency created an ad that depicted two Black men being handcuffed by police with the caption “their crime was driving while black,” it was targeted to groups tagged under “Martin Luther King Jr.,” “Malcolm X,” and “black history.” Another example, from the same Russian-backed troll group, was an ad centered on “Blue Lives Matter” that targeted groups tagged under “The Thin Blue,” “Police Wives Unite,” and the “Officer Down Memorial Page.” It’s important to be as generalized as possible with these tags, even though they do nothing but create broad stereotypes of these groups. But hey, nuance in this current political hellscape is overrated. Or so I’ve heard.
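Mechanically, this kind of tag-based targeting boils down to simple keyword matching between an ad’s tag list and the interests listed on a user’s profile. Here’s a minimal sketch of that idea – all of the ad names, users, and interests below are hypothetical, and this is obviously not Facebook’s actual ad-delivery system:

```python
# Hypothetical sketch of tag-based ad targeting: an ad "reaches" any user
# whose listed interests overlap with the ad's target tags.
ads = {
    "blue-lives-ad": {"Blue Lives Matter", "Police Wives Unite"},
    "black-history-ad": {"Martin Luther King Jr.", "black history"},
}

users = {
    "alice": {"gardening", "Police Wives Unite"},
    "bob": {"black history", "jazz"},
    "carol": {"cooking"},
}

def audience(ad_tags, users):
    """Return the names of users whose interests intersect the ad's tags."""
    return sorted(
        name
        for name, interests in users.items()
        if ad_tags & interests  # non-empty set intersection means a match
    )

print(audience(ads["blue-lives-ad"], users))     # ['alice']
print(audience(ads["black-history-ad"], users))  # ['bob']
```

The broader the tags, the larger the intersection – which is exactly why the real ads leaned on generic tags like “black history” instead of anything specific.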
With the system refusing to fight against the general onslaught of misinformation and with qualities like political integrity being antiquated thoughts of a bygone era, as the old saying goes, if you can’t beat them…
Penzenstadler, Nick, et al. “We Read Every One of the 3,517 Facebook Ads Bought by Russians. Here’s What We Found.” USA Today, Gannett Satellite Information Network, 13 May 2018, www.usatoday.com/story/news/2018/05/11/what-we-found-facebook-ads-russians-accused-election-meddling/602319002/.
“Social Media Advertisements.” The Permanent Select Committee On Intelligence Democratic Office, democrats-intelligence.house.gov/facebook-ads/social-media-advertisements.htm.
“Facebook Ad Revenue 2009-2017.” Statista, www.statista.com/statistics/271258/facebooks-advertising-revenue-worldwide/.
Stahl, Jeremy. “Mueller Indicts 13 Russians for Interfering in the 2016 Election.” Slate Magazine, Slate, 16 Feb. 2018, slate.com/news-and-politics/2018/02/mueller-indicts-13-russians-for-interfering-in-the-2016-election.html.
Franceschi-Bicchierai, Lorenzo. “Here Are 14 Facebook and Instagram Ads That Russian Trolls Bought to Divide Americans.” Motherboard, Motherboard, 1 Nov. 2017, motherboard.vice.com/en_us/article/a377ej/facebook-instagram-russian-ads.