Facebook algorithms accused of “contributing” to the Rohingya genocide in Myanmar
In 2017, Facebook’s content-curation algorithms “directly contributed to the murders and other atrocities” committed against the Rohingya Muslim minority. That is the finding of an investigation by the international organization Amnesty International.
In the 74-page report, the human rights group documents violations of Rohingya rights over the past five years amid systematic persecution and apartheid. Since 2017, the genocide has claimed the lives of more than 25,000 people.
According to the organization, Meta “substantially contributed” to the ethnic cleansing in Myanmar by amplifying discord and hatred.
“The mass dissemination of messages inciting violence and discrimination against the Rohingya, as well as other dehumanizing content, poured oil on the fire of long-standing discrimination and substantially increased the risk of an outbreak of violence,” the report says.
At the end of 2016, Myanmar’s armed forces began a series of crackdowns in Rakhine State, where most of the Rohingya population lived in crowded ghettos. Numerous human rights violations were documented: beatings, killings, rape, arbitrary arrests, and enslavement.
Satellite imagery also captured the military burning thousands of homes. The atrocities, many of them carried out by radical Buddhist nationalists, intensified in early 2017 and triggered a wave of rebel counterattacks.
The country’s armed forces then launched so-called “clearance operations”: a genocidal campaign that involved artillery, military helicopters, and anti-personnel mines.
According to the report, social networks like Facebook helped extremist nationalists persecute and dehumanize the Rohingya by amplifying a massive flow of such content.
“Previously, people followed their religious leaders, and when those leaders, together with the government, began sharing hateful statements on the platform, people’s minds changed,” the report quotes school teacher Mohamed Ayasa as saying.
According to Amnesty International’s Secretary General Agnès Callamard, even before the “escalation of the atrocities,” Facebook’s algorithms were amplifying hatred toward the minority, which fueled violence in the real world. While the Myanmar military was committing crimes against humanity, Meta was profiting from the echo chamber of hatred created by its toxic algorithms, she added.
Callamard also said the company must be held accountable and compelled to pay reparations to all those who suffered the consequences of its reckless actions.
As one of the “countless” examples of content dehumanizing the Rohingya that the social network’s algorithms amplified, Amnesty International highlighted a post by Min Aung Hlaing. In September 2017, the military leader declared on Facebook that “there is absolutely no Rohingya race in Myanmar.”
The tech giant did not block his account until a year later.
Rafael Frankel, Meta’s director of public policy for emerging markets in the Asia-Pacific region, said the company supports efforts to hold the military accountable for its crimes against the Rohingya.
“To this end, we have voluntarily and lawfully provided data to the UN Independent Investigative Mechanism for Myanmar and to The Gambia, and we are currently participating in the OECD complaint process,” he said.
Recall that in November 2021, the Israeli army deployed an extensive facial recognition system to track Palestinians in the West Bank.
In December 2020, Alibaba admitted it had developed AI technology to identify members of the Uyghur minority in China.