Ex-Meta Engineer Sues Company, Claims Firing Over Gaza Content Handling
A former Meta engineer has accused the company of bias in its handling of content related to the Gaza conflict, alleging that he was fired for trying to address bugs that suppressed Palestinian Instagram posts. Ferras Hamad, a Palestinian-American who joined Meta’s machine learning team in 2021, filed a lawsuit in California state court, claiming discrimination, wrongful termination, and other misconduct related to his dismissal in February.
Hamad’s complaint alleges that Meta exhibited a pattern of bias against Palestinians. He claims that the company deleted internal communications that mentioned the deaths of employees’ relatives in Gaza and conducted investigations into their use of the Palestinian flag emoji. According to the lawsuit, no similar investigations were conducted for employees using Israeli or Ukrainian flag emojis in comparable contexts.
Meta has not yet responded to Reuters’ request for comment on these allegations.
Hamad’s accusations echo long-standing criticisms from human rights groups regarding Meta’s content moderation related to Israel and the Palestinian territories. These criticisms were also reflected in an external investigation commissioned by Meta in 2021.
The conflict in Gaza intensified after Hamas militants attacked inside Israel on October 7, killing 1,200 people and taking more than 250 hostages, according to Israeli tallies. Israel's subsequent offensive in Gaza has killed more than 36,000 people and triggered a humanitarian crisis, according to Gaza health officials.
Since the outbreak of the war, Meta has faced accusations of suppressing support for Palestinians. Earlier this year, nearly 200 Meta employees raised similar concerns in an open letter to CEO Mark Zuckerberg and other company leaders.
Hamad’s firing appears to be linked to a December incident involving an emergency procedure known within Meta as a SEV, or “site event,” which is designed to address severe platform issues. Hamad had noted irregularities in handling a SEV related to content restrictions on Palestinian Instagram accounts, which prevented their posts from appearing in searches and feeds.
One example from the complaint describes a short video by Palestinian photojournalist Motaz Azaiza, depicting a destroyed building in Gaza, that was misclassified as pornographic. Hamad says he received mixed guidance from colleagues about the SEV's status and whether he was authorized to resolve it, but his manager later confirmed in writing that addressing SEVs was part of his job.
In January, a Meta representative informed Hamad that he was under investigation. Following this, Hamad filed an internal discrimination complaint and was fired a few days later. Meta stated that Hamad was terminated for violating a policy that prohibits employees from working on accounts of people they personally know, referring to Azaiza. However, Hamad maintains that he had no personal connection to Azaiza.
Hamad’s lawsuit adds to broader scrutiny of Meta’s content moderation practices and its handling of material related to the Israel-Palestinian conflict, illustrating the difficulties tech companies face in moderating platform content during periods of political and social unrest. As the case progresses, it may draw further attention to Meta’s internal policies and how they are applied. The suit has attracted interest from human rights organizations, and its outcome could influence how tech companies address alleged bias in moderating sensitive content from conflict zones.