Facebook is automatically generating pages for terrorist groups, including Islamic State and Al Qaeda, a whistleblower claims.
A report by the whistleblower, released through the National Whistleblower Centre, identified more than 200 auto-generated pages for Islamic State and dozens of pages for other terrorist groups, including Al Qaeda and Ansar Al Sharia.
The report’s findings raise questions about the effectiveness of Facebook’s artificial intelligence (AI) systems, which are designed to filter out content from Islamic State.
Its disclosure comes as Facebook’s global head of policy management, Monika Bickert, is due to give evidence to the US Senate committee on commerce, science and transportation’s investigation into the spread of online terrorism and the efforts of technology companies to control it.
Bickert told the House Homeland Security Committee hearing on 26 June that Facebook “removes any content that praises or supports terrorists or their actions whenever we become aware of it”.
According to the report, simple searches on Facebook uncovered hundreds of profiles praising terrorist groups, written in Arabic and English.
They include an automatically generated page called “I love Islamic State”, which features an image of the Facebook thumbs up with the terrorist group’s logo in the centre.
“The page shows clear support for terrorism, in violation of Facebook’s policies, but was created by Facebook and remains active,” the whistleblower’s report claims.
In one case, an individual using the name Mohammed Atta created a profile page claiming he worked for “Al Qaidah”, attended terrorist school in Afghanistan and studied at the “University of Master Bin Laden”.
Facebook automatically created a series of supporting pages from entries in his profile, which remained active until at least 2 September 2019.
Facebook has been aware of the problem since at least May 2019, when the whistleblower released an initial report, sent to the US regulator, the Securities & Exchange Commission, revealing that terrorist groups and neo-Nazis were expanding their reach through automatically generated Facebook pages.
Facebook did not attempt to delete the pages identified by their URLs in that report until 25 June 2019, the day before Bickert was due to appear before a congressional hearing on extremism, the latest report claims.
Lawmakers wanted to know why Facebook had not deleted all of the pages identified by the report, including a page for Al Qaeda in the Arabian Peninsula with 217 followers.
Bickert told the hearing that Facebook was proactive in its identification of terrorist content. “Importantly, we do not wait for Isis or Al Qaeda to upload content to Facebook before placing it into our internal detection systems. Instead, we proactively go after it,” she said.
The latest whistleblower report questions whether Facebook has made adequate efforts to tackle automatically generated terrorist pages since the June hearing.
The report cites one automatically generated page for the terrorist group Ansar Al Sharia, dating back to at least 2015, which had more than 2,400 likes and was used by young radicals to stay connected with one another.
“Facebook is essentially providing free advertising to terrorist groups. It is clear the company has made no effort to change its auto-generation feature since its last hearing before US lawmakers,” it said.
Facebook said it prioritises deleting content that violates its policies. “Our priority is detecting and removing content posted by people that violates our policy against dangerous individuals and organisations to stay ahead of bad actors,” it said.
“Auto-generated pages are not like normal Facebook pages, as people can’t comment or post on them, and we remove any that violate our policies. While we cannot catch every one, we remain vigilant in this effort.”