How to report hate speech on social media
Illustration by Vanessa Purpura / The Daily Gamecock
Both users and non-users can report hate speech on social media platforms and to the university. Here's how:
Bias and hateful events or messages can be reported to the university through the Bias and Hate Incident Form. Anyone can submit the form, whether the incident involves a current student, a university employee or a member of the community.
Once a bias report is submitted, the university will respond within 72 hours, but not every incident will be investigated.
The report is reviewed by Cliff Scott, the director of the Office of Equal Opportunity Programs (EOP); Julian Williams, the vice president for diversity, equity and inclusion; and Marc Shook, the dean of students, before any investigation by the EOP office begins. Depending on the case, the Office of the General Counsel may also get involved.
These administrators decide if the alleged behavior constitutes “a violation of the university's policies prohibiting unlawful discrimination and harassment,” university spokesperson Jeff Stensland said in an email.
If the behavior is considered a violation, the EOP office will investigate it; if not, the report will be directed to another university department for a response.
Snapchat
All Snapchat users are required to agree to a terms of service contract before their account can be used.
Snap Inc.'s community guidelines state that users should not post content that "demeans, defames, or promotes discrimination or violence."
Users can report abuse in the app or by filling out a form through Snapchat Support.
In the app, users can report accounts, snaps or stories by pressing and holding down on the account or content.
“We condemn racism and have zero tolerance for it on Snapchat. Our Community Guidelines clearly prohibit content that incites racial violence, hate speech, and discrimination of any kind,” a Snap spokesperson said on June 2 in an email statement regarding Jackson's post. “We encourage anyone who sees something like this to always report it so our Trust and Safety team can take action, which can include removing the offending content and if appropriate, terminating the account."
Twitter
Depending on the violation, tweets that break Twitter's rules may have their visibility limited or be removed. Accounts may be placed in a read-only mode that restricts tweeting, or suspended altogether.
To report an account or list, users can select the three dots at the top right of the profile they want to report. Tweets can be reported by selecting the down-facing arrow in the top right corner of the tweet.
Both Twitter users and non-users can also fill out a form on the Twitter website to report content.
Facebook and Instagram
Both Instagram and Facebook will remove content containing hate speech.
"We do not allow content that attacks people based on their race, ethnicity, national origin, religious affiliation, or their sexual orientation, caste, sex, gender, gender identity, and serious disease or disability. If we find content that violates these policies, we will remove it," a Facebook company spokesperson said in an email on June 19.
On Instagram and Facebook, hate speech can be reported within the app and online.
For users with an account, posts and profiles can be reported by tapping the three-dot icon.
People who do not have an account can also report Facebook messages that do not adhere to the platform's community standards.