Facebook accused of inciting hatred by hosting material glorifying terrorist groups
Under-fire Facebook is hosting hate-filled material glorifying terrorist groups, as well as photographs of the coffins of children killed during the Troubles.
An audit of paramilitary content on the social media behemoth's site carried out by The Belfast Telegraph showed it has pages dedicated to dissident republican murder squads and "heroes" from the Provisional IRA.
Facebook is also carrying loyalist paramilitary material, links to offensive sectarian songs and videos, as well as messages urging people to murder PSNI officers.
The American social media giant, whose CEO Mark Zuckerberg is the fifth richest person in the world with an estimated worth of £48bn, is also hosting graphic images of scarred children's bodies after they were killed in Troubles outrages.
Sir Jeffrey Donaldson yesterday attacked Facebook for allowing the publication of sectarian material, which he said represented an illegal incitement to hatred and support for terrorism.
The DUP MP, whose cousins Sam and Alex Donaldson were killed in IRA attacks, said: "I'm very concerned Facebook is hosting material that supports terrorism. Free speech is one thing, but direct abuse and hatred is another."
Mr Donaldson also called for those who were posting terror-related material on Facebook to be prosecuted.
He added: "The law is quite clear on incitement to hatred and support for terrorism, and Facebook needs to have a clear line about what is acceptable and what isn't. There is no doubt that social media today is potentially reaching as many people as, if not more than, mainstream media, and therefore it needs to be properly regulated.
"If there is clear evidence where people have broken the law in terms of incitement to hatred or support for terrorism then the law should follow through and they should be prosecuted. The law also needs to be clarified on Facebook's position as a company - it is a publisher and should be treated as such."
Critics of Facebook have long called for it to be regulated in the same way as mainstream broadcasters and publishers.
Many of the hate-filled messages have been on Facebook pages since 2011, including dozens of posts calling for former Celtic manager Neil Lennon to be killed.
One reads: "Hang Neil Lennon, hang him high." It was posted two months after Mr Lennon was bombarded with death threats and had bullets posted to Celtic football club's office.
Police officers are also targeted on Facebook, with one user issuing sinister threats, adding "God willing I get the chance to dance on their f****** graves".
There are also hundreds of sectarian comments on Facebook, most too offensive to be published in this newspaper.
And there are scores of photos of balaclava-wearing republican terrorists brandishing automatic weapons on so-called training exercises and patrols.
The terrorist material comes to light a day after it was revealed Facebook is refusing to delete videos and images of violent death, abortion and self-harm because it does not want to censor its users.
It allows posts detailing how to murder women, with one of the corporation's manuals telling staff the comment 'To snap a b****'s neck, apply your pressure to the middle of her throat' does not break its rules.
Abuse such as 'F*** off and die' is also permitted because Facebook does not deem it a 'credible threat'.
It also advises that some photos of children suffering non-sexual physical abuse or bullying are acceptable unless there is an element of 'sadism or celebration'.
And animal abuse snaps can be shared unless extremely upsetting.
Live-streaming self-harm bids get the nod too as Facebook 'doesn't want to censor or punish people in distress'.
The firm's attitude to offensive material was exposed when its rulebook on content was leaked to The Guardian.
Facebook also permits videos of violent deaths, claiming they can raise awareness of mental health issues, and of abortions, as long as there is no nudity. But one example of a post to be deleted is 'Someone shoot Trump' because it is classed as a credible threat.
Facebook, which has two billion users, is under pressure to control extreme content and porn.
But many staff say rules are confusing and they are so busy they often have just 10 seconds to make a decision.
There are only 4,500 Facebook moderators to police the accounts of nearly two billion users, and they are instructed to delete controversial material only in certain circumstances.
The Guardian said it had seen more than 100 internal training manuals, spreadsheets and flowcharts on how Facebook moderates issues such as violence, hate speech, pornography and racism.
The firm's Monika Bickert told the newspaper: "We feel responsible to our community to keep them safe and we feel very accountable. It's absolutely our responsibility to keep on top of it. It's a company commitment. No matter where you draw the line, there are always going to be some grey areas."
Facebook yesterday said it was looking into the material after the Belfast Telegraph made it aware the content was on its pages.