Researchers find a surge in links to controversial sites and redditors from “fringe” forums. The company’s CEO says Reddit is “digging deeply” on potential abuse.
Angry lawmakers on Capitol Hill grilled Facebook, Google and Twitter lawyers last month over their companies’ roles in the spread of misinformation during the 2016 presidential election.
Missing from two days of hearings? Reddit, the site that calls itself “the front page of the internet.”
According to data scientist and active Reddit user Rishab Nithyanand, Reddit also saw links to stories spreading misinformation and divisive content, the same problems experienced by the social media companies under scrutiny. Redditors, too, were exposed to more posts from users who also spend time in “fringe” forums. And they saw more offensive language, including cursing and slurs, according to research Nithyanand produced as part of his work at the Data & Society think tank in New York.
Those problems haven’t gone away.
What Nithyanand came to realize was that Reddit represents an important part of the story about the spread of misinformation or “fake news” across social media platforms. While it may seem that Reddit forums are insular, the site punches above its weight in influence on the internet, said Brian Solis, an analyst focusing on social media at research firm Altimeter. It’s the fifth most popular website in the US, according to analytics firm Alexa, with more than 250 million users.
“It is where a lot of information starts and spreads,” Solis said.
(On Tuesday, Reddit released its 2017 “best of” lists, noting that “redditors have been hard at work creating more content, conversations, and communities than ever before.” Its tallies for the year include more than 7 million Reddit Live thread updates, 900 million comments and 12 billion upvotes.)
To average Reddit users, these trends meant that every time they logged into their favorite political forum after December 2015, they were significantly more likely to see posts authored by redditors who also frequented forums like r/nazi, r/killingwomen or r/antifatart. (For the uninitiated, Reddit forums are called subreddits, and their names all start with r/.)
Or comments like this one, posted in r/The_Donald: “Her wall got breached without lube by a f***ing Trump Train on steroids with a Cruz missile attached. How’s that for f***ing bad.”
Or links to controversial news sources, like beforeitsnews.com, which in November published a story claiming US Marines were blocking a coup against President Donald Trump.
While these shifts happened in forums all over Reddit, Nithyanand found the activity was most intense in Republican-oriented subreddits, like r/Republican, r/Conservative, r/TedCruz, r/MarcoRubio and r/The_Donald. The increase was so much more pronounced on forums like those that Nithyanand thinks he might have stumbled on a planned push to change the tone and content of conservative political discussions across Reddit.
“We use this as evidence in our ongoing investigation of a coordinated misinformation campaign targeted at Republican subreddits,” he wrote in a paper made public this month.
Reddit declined to comment on the record about Nithyanand’s research.
On Nov. 1, the same day that lawyers for Facebook, Twitter and Google were in Washington answering questions, Reddit CEO Steve Huffman took part in an “Ask Me Anything” session with Reddit users and told them he and Reddit’s CTO were personally overseeing what happened on Reddit leading up to the election.
“I would love to be completely transparent about what we’re doing here, but given the sensitive nature of the situation, I have to be vague,” Huffman wrote. “We take both the integrity of Reddit and the US elections extremely seriously. We’re digging deeply. … When we have something to share, we will.”
Nithyanand said he was inspired to research the issue because he’s been using Reddit since 2013, mostly to get political news. He noticed a change in the tone on Reddit before the election and wondered what kinds of trends he could find in posts and comments over time.
Trained in data science at Stony Brook University, Nithyanand has researched communication and censorship on the internet, and his projects have been written about in Wired, Newsweek and Vox. He did his research on Reddit with fellow computer scientist Phillipa Gill and political science researcher Brian Schaffner, both of the University of Massachusetts Amherst.
A place to share everything
Reddit, founded in 2005, gets its influence by serving as the place for thousands of daily online conversations about every topic you can think of, from fandoms to hobbies to religion and politics.
Redditors follow internal rules enforced by each forum’s moderators and vote on each other’s comments. That means each subreddit develops its own tone and norms of behavior.
Men make up the majority of Reddit users, about 64 percent as of October. Reddit Vice President Zubair Jandali told eMarketer in 2016 that more than 87 percent of the site's 200 million monthly visitors were millennials. A 2016 study by the Pew Research Center showed that 58 percent of Reddit users are between the ages of 18 and 29.
The same research showed that 63 percent of redditors are white, at a time when the US adult population was 65 percent white.
Nithyanand isn't the first to wonder what's influencing political forums on Reddit. Political statistics news site FiveThirtyEight published an analysis of r/The_Donald, a popular forum supporting Trump's campaign, showing what it had in common with more niche and trollish subreddits. And the Pew Research Center published an analysis of political conversations on Reddit in the run-up to the primary elections.
But neither looked at the trends Nithyanand and his co-authors sought out, including the spread of misinformation on Reddit.
Nithyanand examined 12 million posts and 332 million comments on Reddit, according to his paper. That included all posts from 124 political subreddits and a random sampling of posts and comments from nonpolitical subreddits. The political subreddits included nonpartisan forums like r/politics as well as party- and candidate-specific subreddits like r/Republicans and r/SandersForPresident.
What the researchers found is that visitors to Republican-affiliated subreddits were 600 percent more likely to see links to controversial sources after the start of the Republican primaries, and 1,600 percent more likely after the Republican National Convention in July 2016, than they were before the campaigns started.
What’s more, over 80 percent of all posts and comments about links to these sites were on Republican-affiliated subreddits before and after the election, Nithyanand said.
That includes links to stories like one posted on gopthedailydose.com with the headline, “Samuel L. Jackson on Donald Trump: ‘If That Motherf*cker Becomes President I Will Move my Black Ass to South Africa.'” (Jackson made the remark as part of a comedy bit on “Jimmy Kimmel Live”, but some websites repeated his joke as sincere.)
In addition to gopthedailydose.com, Nithyanand found links to sites like beforeitsnews.com and coed.com, which in 2015 published a story saying a Black Lives Matter protest at Dartmouth College turned violent, even though local and campus news outlets reported that police received no complaints of violence.
Dennis Melnik, vice president at Coed Media Group, said the company doesn’t publish fake or fabricated news in its news section and noted that its stories have appeared in Google News since April. To source their stories, “we stick to established news sources: CNN, AP, Forbes,” Melnik said.
He didn’t respond to questions about the story about the protest at Dartmouth, but said anyone writing fake news at coed.com would be fired. Gopthedailydose.com and beforeitsnews.com didn’t respond to requests for comment.
Nithyanand found these links by searching Reddit for sources flagged by OpenSources, a project run by a Merrimack College communications professor that vets online information sources. OpenSources identifies sites that promote conspiracy theories or publish news that appears heavily biased or outright false.
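The paper doesn't publish the researchers' code, but the core of that kind of search, checking each submitted link's domain against a list of flagged sites, can be sketched in a few lines of Python. The domains and the `count_flagged_links` helper below are illustrative stand-ins, not the study's actual list or tooling:

```python
from urllib.parse import urlparse

# Example domains mentioned in this article; the real OpenSources list
# is far larger and categorizes sites (bias, conspiracy, fake, etc.).
FLAGGED_DOMAINS = {"beforeitsnews.com", "gopthedailydose.com", "coed.com"}

def domain_of(url: str) -> str:
    """Extract the host from a URL, dropping a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def count_flagged_links(posts) -> int:
    """Count link posts pointing at a flagged domain.

    `posts` is an iterable of dicts with a 'url' key, a simplified
    version of the shape Reddit's API returns for link submissions.
    """
    return sum(1 for p in posts if domain_of(p.get("url", "")) in FLAGGED_DOMAINS)

posts = [
    {"url": "https://www.beforeitsnews.com/some-story"},
    {"url": "https://example.com/ordinary-news"},
]
print(count_flagged_links(posts))  # prints 1
```

Run over millions of posts bucketed by subreddit and by month, counts like this are what let the researchers compare link rates before and after the primaries began.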
The ‘fringe’ moves inward
The researchers also concluded that people posting on political subreddits were getting more radical.
Before December 2015, redditors who posted on political subreddits were less likely to frequent other forums where topics like violence and hatred toward women, minorities and LGBT people were discussed. As the primaries heated up, that changed, Nithyanand found.
Redditors active in the fringe groups Nithyanand identified increased their posts on subreddits affiliated with the Democratic party by 200 percent during the election — that covers the time period between December 2015 and Election Day on Nov. 8, 2016. On Republican subreddits, it was 6,600 percent.
There was a relationship between the number of comments made by redditors who were also active on hateful subreddits and the offensiveness of the comments on political subreddits. The more “fringe” commenters, the more offensive the discussions got, Nithyanand found.
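The paper frames this as a statistical relationship. As a rough illustration only (not the researchers' actual method), a Pearson correlation over per-period counts would capture such a pattern; the monthly figures below are made up:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

# Hypothetical monthly counts: comments by fringe-forum users vs.
# offensive comments in the same political subreddit.
fringe_comments = [10, 25, 40, 80, 160]
offensive_comments = [5, 12, 22, 45, 90]
print(round(pearson(fringe_comments, offensive_comments), 3))
```

A coefficient near 1 would mean the two counts rise together, which is the shape of the relationship the researchers describe.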
Changing the conversation
As the presidential primaries ended with national conventions in July 2016, redditors in political subreddits were about 20 percent more likely to see offensive posts than users looking at nonpolitical subreddits. That number held steady until May 2017, when it jumped to 30 percent.
To look for trends in offensive language on Reddit, the researchers created a computer program that read through each post and comment to identify ones they considered “offensive” or “hateful.”
The language included cursing, slurs and insults. One example: a Reddit user with the name PipeSmokingGoat in May 2016 wrote, “OK seriously, it’s now an absolute f***ing shame that Trump isn’t actually Hitler. It’s a literal war on whites and the whites are too cucked to actually fight back.” (“Cuck” is an insult used on ultraconservative forums to ridicule people for softening their stances.)
Or another that read, “LOL WHAT NO IT ISN’T IT WAS WIKILEAKS YOU C***.”
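The paper doesn't detail how that program classified text, so the sketch below is only a guess at the simplest form such a detector could take: a lexicon lookup that flags a comment when any of its words appears on a list. The word list here is a harmless placeholder; a real detector would use a large curated lexicon or a trained model.

```python
import re

# Placeholder terms; a real offensive-language lexicon would be far larger.
OFFENSIVE_TERMS = {"cuck", "moron", "idiot"}

WORD_RE = re.compile(r"[a-z']+")

def is_offensive(comment: str) -> bool:
    """Flag a comment if any lowercase token matches the lexicon."""
    tokens = set(WORD_RE.findall(comment.lower()))
    return not tokens.isdisjoint(OFFENSIVE_TERMS)

def offensive_rate(comments) -> float:
    """Fraction of comments flagged, the statistic tracked over time."""
    comments = list(comments)
    return sum(map(is_offensive, comments)) / len(comments) if comments else 0.0

print(offensive_rate(["you absolute cuck", "great point, thanks"]))  # prints 0.5
```

Comparing this rate between political and nonpolitical subreddits, month over month, would yield the kind of 20-to-30-percent gap the researchers report.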
Who’s looking at Reddit?
From one perspective, it might be reasonable for lawmakers to focus on social media networks besides Reddit. While Reddit holds the No. 5 spot in the list of the most popular websites in the US, data from the Pew Research Center in 2016 showed that only 4 percent of American adults are redditors. Compare that with the 68 percent of US adults who use Facebook, or the 21 percent of US adults who use Twitter.
Still, the site has influence, something Reddit has brought to the attention of advertisers. On a webpage designed for companies interested in advertising on the social forum, Reddit describes its user base this way: “This passionate audience has grown to become the most influential community online and covers a wide variety of interests.”
Reddit makes money off targeted ads, sponsored posts from brands and a premium subscription called Reddit Gold, which costs $3.99 a month or $29.99 a year. Gold users get perks like the ability to turn off ads, create gold-only subreddits and use a custom avatar.
Privately held by Advance Publications, Reddit doesn't share its earnings data, but it announced a $200 million investment round in July, bringing its valuation to $1.8 billion. (Facebook, by comparison, is valued at more than $519 billion; Twitter at over $15 billion.)
There’s a simple reason election manipulation on Reddit may have escaped lawmakers’ notice, said Michael Pachter, an analyst with consulting firm Wedbush: They’re probably not using the site.
“Yes, I’m surprised that Reddit hasn’t been included, but it skews much younger,” Pachter said. “It’s unlikely lawmakers are even aware of Reddit. If you want to have fun, ask a few of them.”
Sen. Mark Warner of Virginia, the ranking Democrat on the Senate Intelligence Committee, has mentioned Reddit as a site the investigation into online election manipulation should look at. When asked whether Reddit has been asked to testify before the Senate Intelligence Committee, a spokeswoman for Warner said in an email that "there's been some initial outreach."
Spokespeople for three representatives and senators who led the other two hearings didn't respond to multiple requests for comment on whether Reddit was invited to the hearings or is being looked at as part of the larger investigations into social media manipulation. A spokeswoman for Rep. Mike Conaway, a Texas Republican who chairs the House Intelligence Committee, declined to comment.
Investigators will need to think carefully about what to do if they learn from Reddit that the site was used to spread misinformation, said Solis, the social media expert.
But as of now, he said, “they haven’t even realized the extent of the problem.”