The social networking giant banned Raja Singh shortly after the Wall Street Journal reported that a pro-Modi executive at the company had blocked attempts to censure him.
A member of India’s governing far-right Bharatiya Janata Party (BJP) has been banned from Facebook platforms for hate speech.
The company announced on Thursday that it had banned T Raja Singh for posts violating its policies on the promotion of violence and hate.
Singh, a BJP legislator in the state of Telangana, is accused of inciting violence and hatred against India’s Muslim community and has previously described Rohingya refugees as terrorists who should be shot. However, he denies that he ever had a page on Facebook.
Facebook flagged Singh as a dangerous individual, but the Wall Street Journal reported in August that a senior executive prevented the company from taking action against him and others on the Indian far-right.
Ankhi Das, Facebook’s Public Policy Director for South and Central Asia, is said to have prevented Facebook from banning individuals belonging to the BJP for fear that it would harm the company’s relationship with the party.
In a subsequent report, also by the Wall Street Journal, Das is reported to have expressed her support for the BJP and its leader, Indian Prime Minister Narendra Modi, in internal company communications.
Das celebrated Modi’s election victory in 2014 and disparaged his opponents in the rival Congress party.
“It’s taken thirty years of grassroots work to rid India of state socialism finally,” Das said, praising Modi as a “strongman”.
Colleagues in the company expressed concerns about the comments, saying they ran counter to Facebook’s pledge to remain neutral on political matters.
Das also shared a post that described India’s Muslims as a “degenerate community”, later apologising to her Muslim co-workers.
The revelations earned Facebook’s India chief, Ajit Mohan, an invitation to a grilling by the Indian parliament’s panel on hate speech.
Legislators from across the spectrum questioned the executive on Facebook’s policies on hate speech, with members on both sides accusing the company of bias.
Mohan said the company had removed 22.5 million hate posts globally and insisted that it does not play favourites when it comes to politics.
Since 2011, Facebook has provided training to a number of Indian political parties on how best to use the platform to mobilise supporters, according to the Wall Street Journal.
Despite its pledge to remain politically neutral and stamp out hate rhetoric, Facebook stands accused of not doing enough to prevent sectarian incitement and violence.
Officials prosecuting those responsible for the deadly anti-Muslim riots in Delhi earlier this year found that groups of Hindu extremists had used WhatsApp, the Facebook-owned messaging application, to coordinate their attacks.
A WhatsApp group with 125 members was used to share hate rhetoric and mobilise anti-Muslim rioters.
A 2019 investigation by the activist group Avaaz found that Facebook was being used to stir up hatred against Muslims in the eastern Indian state of Assam.
The report detailed how Bengali Muslims, whom Hindu extremists describe as illegal infiltrators from Bangladesh, were called “rats” and “rapists” in posts that were not taken down.
Avaaz said that of the 213 “clearest” instances of hate speech it brought to Facebook’s attention, only 96 were taken down by the company.
India is not the only country where Facebook is accused of providing a platform for hatred. The company also faces similar accusations of tolerating and amplifying white supremacists and the far-right in the US and European states.
In 2018, the UN blamed Facebook for accelerating the genocide of Rohingya Muslims in Myanmar, stating that the country’s military and Buddhist extremists had used the platform to encourage the violence.
“I’m afraid that Facebook has now turned into a beast, and not what it originally intended,” said UN investigator Yanghee Lee.