Reddit bans subreddit ‘r/DonaldTrump’ for 'repeated policy violations' after Capitol riot
LOS ANGELES - Reddit has banned the subreddit "r/DonaldTrump" from its platform after repeated policy violations in recent days amid the violent breach of the U.S. Capitol by supporters of the president.
The popular social networking site hosts thousands of subreddits, and "r/DonaldTrump" was one of the platform’s largest political communities in support of the president.
"Reddit's site-wide policies prohibit content that promotes hate, or encourages, glorifies, incites, or calls for violence against groups of people or individuals. In accordance with this, we have been proactively reaching out to moderators to remind them of our policies and to offer support or resources as needed," a Reddit spokesperson told FOX TV Stations. "We have also taken action to ban the community r/donaldtrump given repeated policy violations in recent days regarding the violence at the U.S. Capitol."
Anyone who visits the subreddit now sees a message stating, "This community was banned due to a violation of Reddit’s rules against inciting violence," with Reddit linking to its rule titled "Do not post violent content."
And this isn’t the first time Reddit has banned similar communities.
In June, Reddit banned the subreddit r/The_Donald, along with roughly 2,000 other communities. That policy update came about three weeks after Black Lives Matter protests began.
The news comes as several other social media outlets have banned similar content, and just days after President Donald Trump’s social media accounts were blocked during a violent pro-Trump riot inside the U.S. Capitol.
On Thursday, YouTube announced that all channels posting false election claims will now receive a strike and temporary account suspension.
"Over the last month, we’ve removed thousands of videos which spread misinformation claiming widespread voter fraud changed the result of the 2020 election, including several videos that President Trump posted yesterday to his channel," Alex Joseph, a YouTube spokesperson, told FOX TV Stations.
On Wednesday, YouTube removed a video posted to Trump’s social media channel that YouTube said "violated our polices regarding content that alleges widespread fraud or errors changed the outcome of the 2020 U.S. Election."
Trump posted the video on multiple social media platforms asking a violent mob of his supporters to leave the Capitol building after they'd stormed it earlier on Wednesday. Trump spoke to his supporters in the video, saying, "I know your pain. I know your hurt. But you have to go home now." He went on to call the violent mob of his supporters "very special."
In addition, Twitter locked President Donald Trump out of his account for 12 hours on Wednesday over the video and several other tweets, demanding that he delete the tweets that violated the company’s policies.
Twitter swiftly moved on Wednesday to block the ability to reply, like, or retweet the president's video due to "a risk of violence," threatening to lock Trump out of his account permanently if he persisted in violating Twitter rules.
Meanwhile, Facebook went one step further, removing Trump's video from its platforms and suspending his accounts. On Thursday, Facebook announced that Trump’s Facebook and Instagram accounts will remain locked indefinitely.
"The shocking events of the last 24 hours clearly demonstrate that President Donald Trump intends to use his remaining time in office to undermine the peaceful and lawful transition of power to his elected successor, Joe Biden," Facebook CEO Mark Zuckerberg said in a statement Thursday morning.
Zuckerberg said Facebook has, for years, allowed Trump to use its platform consistent with its rules, but has removed content or labeled posts that violate its policy.