Our CEO Emma Dalmayne is on BBC2’s Victoria Derbyshire show at 10.15am today.

Autistic Inclusive Meets Community Group (AIM) is at BBC Portland Place, London.


Press release below; please watch:

Facebook in the firing line as MPs launch bid to force social media companies and administrators of online forums to tackle online radicalisation and hate

• Senior MPs from across Parliament come together to tackle online hate, fake news and radicalisation

• Lucy Powell MP accuses Facebook of “washing their hands of the hate and evil perpetrated on their site.”

Lucy Powell MP is today introducing the Online Forums Bill, backed by a cross-party group of senior MPs. The Bill is the first comprehensive attempt to tackle online hate speech, radicalisation, and disinformation on online forums, such as Facebook groups, which can have more members than some newspaper sites.

The Bill is sponsored by MPs from across Parliament including the Chair of the Treasury Select Committee, Nicky Morgan, the Chair of the Justice Select Committee, Bob Neill, Education Committee Chair Robert Halfon, and other senior MPs including Jacob Rees-Mogg, David Lammy, Anna Soubry, Stella Creasy, Luciana Berger, Ruth Smeeth and Jess Phillips.

Lucy Powell will today argue that online groups often act as echo chambers, normalising extreme language and behaviours, which can spill over into the real world. Online hate and radicalisation often go unchallenged in these groups and forums because there is no moderating influence, as there would be if the same opinions were voiced in the classroom, the workplace or the pub. Our laws have not kept pace with the huge explosion in online interactions, and existing safeguards fail to provide adequate protection against, or challenge to, extreme online acts.

The Online Forums Bill makes administrators and moderators of online forums legally liable for any hate speech and defamation which occurs within the groups they manage. The Bill will also ban secret groups on platforms such as Facebook and force moderators to publish the details of secret groups (that is, groups that are only visible to those invited to join) that are currently in existence. Online platforms such as Facebook allow users to create private groups for any purpose, and investigators have found groups replete with criminality and hate speech, run by radicals seeking to evangelise to others.

The ‘administrators’ of forums are supposedly responsible for their content and should therefore remove or report anything which breaches their community standards or the law, but in practice this often does not happen. Around half of global Facebook users are members of a Facebook group; in Britain alone, that means an estimated 21 million people may be members of a group. Facebook allows groups to be ‘secret’, meaning they are invisible to all but their members. This locks out anti-hate charities and the police, allowing illegal and immoral activities to go unchecked.

Nearly 75,000 online crimes were reported last year, with 60 per cent of those relating to harassment and stalking online. Online hate crime is also likely to be under-reported: Gallup’s 2017 Online Hate Crime Report found that 72 per cent of respondents experiencing online hate crime had not reported their most recent experience. According to the charities Tell MAMA and the CST, anti-Muslim and antisemitic incidents both hit record levels in 2017.

The Online Forums Bill would tackle this by making administrators legally liable for the content of the forums they manage, to ensure they take their role as the referees of these forums seriously. Administrators would be given a reasonable amount of time to delete hate and ban users, before they could be liable for action. The Bill would also make ‘secret’ groups public.

Lucy Powell MP, whose parliamentary constituency Manchester Central was the scene of the Manchester Arena terror attack last year, said:

“I’ve seen the danger of online hate and radicalisation in my own community. Social media is becoming a hateful place, with aggressive online discourse fuelling extremism and real world acts of violence. We need to act to ensure that people aren’t taken in by those who seek to use the internet to spread their toxic views. Some of these platforms normalise extreme views, creating a polarised and toxic environment, where hate goes unchallenged, with the thrill of status likes fuelling ever more offensive posts.

“Facebook and other sites shouldn’t be washing their hands of the hate and evil perpetrated on their sites. Instead administrators and moderators of online groups should be the front line against online evils, taking responsibility for the groups they run.”

