Social media reporting procedures

Facebook Reporting Mechanisms

Hate Speech

Facebook explains on this page how to report hate speech and abuse within most of its user features.

In nearly all cases there is a report function in the drop-down menu next to the post, page or message.

Example

If you don’t have a Facebook account you can report through this online form.

All reports are reviewed and a response is provided by Facebook. In cases of severe and threatening hate speech or bullying it’s advised to also report it to the relevant national bodies; find out how here.


Cyberbullying

What to do in case of bullying is described here. In addition, the Bullying Prevention Hub provides tips and resources for teens, parents and educators. It’s a joint initiative with partners working on bullying, and it reaches out both to victims and to witnesses who want to help.


Community Standards

The community standards describe the grounds for Facebook’s decision to take down content that gets reported. They include:

Twitter Reporting Mechanisms

Hate Speech

Twitter explains on this page how to report hate speech and abuse within most of its user features.

Most often the report function is available on a tweet, profile or piece of media content under the drop-down menu with more options.

Example

All reports are reviewed and a response is provided by Twitter. In cases of severe and threatening hate speech or bullying it’s advised to also report it to the relevant national bodies; find out how here.


Cyberbullying / Safety online

There are no specific guidelines provided in cases of bullying.

Twitter does provide tips to promote safe use of its services, which cover (prevention of) bullying for:


Community Standards

The Twitter Rules provide the grounds for taking down content or suspending accounts. This includes abusive behaviour, such as:

  • Violent threats (direct or indirect)
  • Harassment: inciting or engaging in the targeted abuse or harassment of others.
  • Hateful conduct: promoting violence against, or directly attacking or threatening, other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or disease.

There is a specific policy for dealing with content promoting child sexual exploitation and how to report it available here.

Instagram Reporting Mechanisms

Hate Speech

How to report on Instagram depends on the platform you use:
  • Web browser (PC/laptop)
  • App (application on phone/tablet)

In most cases the report function can be found in the drop-down menu with more options.

If you don't have an Instagram account you can report through an online form.

All reports are reviewed and a response is provided by Instagram. In cases of severe and threatening hate speech or bullying it’s advised to also report it to the relevant national bodies; find out how here.


Cyberbullying / Safety online

To report harassment or cyberbullying users are asked to use an online form.

Instagram provides tips on safe use of its services online, including tips for supporting friends and tips for parents.


Community Standards

The Instagram community guidelines describe how the company deals with violations of its terms of use. The guidelines call on users to:

  • Not violate copyright
  • Follow national law
  • Behave respectfully towards other members of the community.

Snapchat Reporting Mechanisms

Snapchat provides an online questionnaire to help its users find answers to different types of safety concerns, including experiences of hate speech, bullying and harassment.

In most cases Snapchat recommends that you:

  • Block the attacker
  • Review your privacy settings
  • Report to the national law enforcement agencies (find out how here)


Safety online / Community Standards

The Snapchat safety centre provides safety tips for users and guidelines to parents and educators.

Its community guidelines describe the expected behaviour of users; they ask for:

  • No nudity or sexual content
  • Respect for privacy
  • No threats
  • No harassment, bullying or spamming

Vkontakte (VK) Reporting Mechanisms

In its terms of use, Vkontakte (VK) provides no explanation of how it deals with hate speech and cyberbullying, or how to report them.

Users can report hate speech and cyberbullying directly to VK. The report function can be found next to the post in the upper right corner.

After pressing the button, the post becomes marked as spam and there is an option to further specify why you consider it spam, including:

  • abuse,
  • materials for adults only,
  • drugs propaganda,
  • child pornography,
  • violence/extremism.

Depending on the post’s content, “abuse” and “violence/extremism” are likely to be the most relevant options for reporting hate speech and cyberbullying.

It is possible to report a ‘community’ of users to the VK support service team. Your report should explain how your personal interests are harmed and provide direct links to prove your argument. More explanation is given here.


Cyberbullying

To protect yourself against cyberbullying, VK recommends adding the bully to your blacklist. Blacklisted accounts can no longer send you friend requests, write you messages or connect to your activities on VK. VK explains the procedure here.


Community Standards

The Terms of use provide that the site administration may, at its own discretion, move or delete any content or account (Article 7.2.2). Users are prohibited (Article 6.3.4) from uploading, saving, publishing, sharing or otherwise using information that:

  • contains threats, or discredits, offends or denigrates the honour, dignity or business reputation of a person, or infringes the inviolability of the personal life of other users or third parties;
  • propagandizes and/or supports racial, religious and/or ethnic hatred, or promotes fascism or racial superiority;
  • contains extremist materials.
