Former Obama spokesperson calls for Zuckerberg to resign as he promotes name change

Facebook needs to change its CEO, not its name: The public has lost trust in the network and the only way to regain it is for Zuckerberg to stand down, professor of communications says

- A former spokesperson for President Barack Obama’s Treasury Department is calling for Facebook CEO Mark Zuckerberg to resign
- This comes after more revelations have been made public about the company’s failure to stop disinformation
- Kara Alaimo claims that Zuckerberg ‘has done little to try to fix’ the problems with the social media behemoth
- The new allegations, submitted anonymously under penalty of perjury, echoed the claims made by fellow whistleblower Frances Haugen
- In the most dramatic line of the affidavit, the former employee anguished over Facebook’s inability to act quickly to help curb racial killings
- ‘It’s clear that [Zuckerberg] lacks the moral inclination or the capacity to solve these problems,’ Alaimo writes




A former spokesperson for President Barack Obama‘s Treasury Department is calling for Facebook CEO Mark Zuckerberg to resign after more revelations have been made public about the company’s failure to stop disinformation.

Kara Alaimo, now serving as an associate professor in the Lawrence Herbert School of Communication at Hofstra University, claims that Zuckerberg ‘has done little to try to fix’ the problems with the social media behemoth.

A new whistleblower affidavit submitted by a former Facebook employee accuses the social media giant of prioritizing profits over its duty to combat hate speech, misinformation and other threats to the public. 

The new allegations, submitted anonymously under penalty of perjury, echoed the claims made by fellow whistleblower Frances Haugen, who delivered a scathing testimony before Congress this month on Facebook’s moral failings.


Zuckerberg, who founded Facebook, has a net worth of $122 billion, making him the 5th-richest person in the world

In the most dramatic line of the affidavit, the former employee anguished over Facebook’s inability to act quickly to help curb racial killings in Myanmar in 2017 as military officials used the site to spread hate speech. 

The op-ed comes as Facebook considers changing the company’s name after all the bad publicity.  

The company is also hinting at plans for a so-called ‘metaverse’ – a virtual reality version of the internet where people can game, work and communicate.

The tech giant’s CEO Mark Zuckerberg has been a leading voice on the concept, which would blur the lines between the physical world and the digital one.

It could allow someone to don a virtual reality headset to make them feel as if they’re face-to-face with a friend, despite being thousands of miles apart and connected via the internet.


Alaimo says the company needs much more than a cosmetic change.

‘The place to start is with Zuckerberg’s resignation,’ she wrote in the op-ed.

Zuckerberg, Alaimo said, either can’t fix Facebook’s issues or won’t.  

‘It’s clear that he lacks the moral inclination or the capacity to solve these problems,’ she wrote. ‘Either way, he’s got to go. The company should announce a new chief executive with all possible haste. It should be someone thoughtful and committed to transparency about how social media is harming our society — who has the will and competence to put the platform on a very different course.’

Simply put, Alaimo argues Zuckerberg’s company, which has nearly three billion users, has lost the public trust.  

‘It’s because the public has lost faith in Facebook. And rightly so. For all the family photos shared or funny videos consumed that the company has made possible, ‘Facebook’ is now also a name associated in recent years with misinformation, privacy violations, the spread of hate and autocracy.’

Alaimo calls Facebook’s reputation ‘bankrupt’ and says a new name will do little to restore public trust. 

Haugen has said that Facebook holds culpability for the January 6 Capitol insurrection

The company said it more or less ‘stumbled’ onto the riot, which resulted in the deaths of five people

As the world begins to truly contend with just how dangerous social media platforms can be, Facebook’s reckoning has been kicked into overdrive following former staffer Frances Haugen’s shocking allegations that the company has long known about its platform’s toxic effects on society — and has done little to try to fix them.

‘The only way for Facebook to restore that trust is to change its leadership and address the actual issues that have justifiably prompted so much concern.’

This is the first time the company has faced such accusations since the internal memos released and testimony given by Haugen. 

Haugen and the new whistleblower also submitted the allegations to the Securities and Exchange Commission, which oversees all publicly traded companies. 

In the SEC affidavit, the anonymous ex-employee alleges that Facebook officials routinely undermined efforts within the company to fight misinformation and hate speech out of fear of angering then-President Donald Trump and his allies. 

The former employee said that on one occasion, Facebook’s Public Policy team defended a ‘white list’ that exempted the alt-right media company Breitbart News and other Trump-aligned publishers from Facebook’s ordinary rules against spreading fake news.   



Ultimately, Alaimo argues, ‘changing a name won’t change reality.’    

The complaints come after Haugen’s testimony before Congress in early October, where she claimed Facebook promoted divisiveness as a way to keep people on the site, with Haugen saying the documents showed the company had failed to protect young users.

It also showed that the company knew Instagram harmed young girls’ body image and even tried to brainstorm ways to appeal to toddlers by ‘exploring playdates as a growth lever.’

‘The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed,’ Haugen said at a hearing.

Haugen, who anonymously filed eight complaints about her former employer with the US Securities and Exchange Commission, told 60 Minutes earlier this month: ‘Facebook, over and over again, has shown it chooses profit over safety.’

She claimed that a 2018 algorithm change prioritizing divisive posts, which provoked arguments among users, was found to boost user engagement.

That in turn helped bosses sell more online ads that have seen the social media giant’s value pass $1 trillion.

‘You are forcing us to take positions that we don’t like, that we know are bad for society. We know if we don’t take those positions, we won’t win in the marketplace of social media,’ Haugen said.

She also blamed Facebook for spurring the January 6 Capitol riot. 

‘We’ve been fueling this fire for a long time’: Facebook employees raged as they scrambled to delete posts that incited violence on Jan 6 – but the firm didn’t allow them to target some groups calling for violence

Whistleblowers at social media giant Facebook are saying the company didn’t do enough to stop the spread of misinformation in the days and weeks leading up to, as well as during, the Capitol riot on January 6. 

New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen provide a rare glimpse into how the company appears to have simply stumbled into the Jan. 6 riot. 

Reports say that in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and inciteful content. 

Emergency actions – some of which were rolled back after the 2020 election – included banning Trump, freezing comments in groups with a record for hate speech, filtering out the ‘Stop the Steal’ rallying cry and empowering content moderators to act more assertively by labeling the U.S. a ‘Temporary High Risk Location’ for political violence.

At the same time, frustration inside Facebook erupted over what some saw as the company’s halting and often reversed response to rising extremism in the U.S.

Former Facebook employee Frances Haugen (pictured above) has provided new details on the company’s response to the Capitol riot

Five people died in the insurrection, which attempted to stop American electors from certifying the 2020 Presidential Election for Joe Biden 

Within Facebook, there were complaints that the company hadn’t done enough to stop the spread of the misinformation and extremism that people believe led to the riots

‘Haven’t we had enough time to figure out how to manage discourse without enabling violence?’ one employee wrote on an internal message board at the height of the Jan. 6 turmoil. 

‘We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.’

It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s part in the Jan. 6 riots.

It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing – on Facebook itself – to stop Congress from certifying Joe Biden’s election victory.

The documents also appear to bolster Haugen’s claim that Facebook put its growth and profits ahead of public safety, opening the clearest window yet into how Facebook’s conflicting impulses – to safeguard its business and protect democracy – clashed in the days and weeks leading up to the attempted Jan. 6 coup.

This story is based in part on disclosures Haugen made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

What Facebook called ‘Break the Glass’ emergency measures put in place on Jan. 6 were essentially a toolkit of options designed to stem the spread of dangerous or violent content that the social network had first used in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.

‘As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,’ Haugen said in an interview with ’60 Minutes.’ 

A Capitol police officer was one of the five people who died as a result of the riot 


Facebook founder Mark Zuckerberg has faced heavy criticism for how his platform controls information and data

An internal Facebook report following Jan. 6, previously reported by BuzzFeed, faulted the company for having a ‘piecemeal’ approach to the rapid growth of ‘Stop the Steal’ pages, related misinformation sources, and violent and inciteful comments.

Facebook says the situation is more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content, as it did on Jan. 6. The company said it’s not responsible for the actions of the rioters and that having stricter controls in place prior to that day wouldn’t have helped.

Facebook’s decisions to phase certain safety measures in or out took into account signals from the Facebook platform as well as information from law enforcement, said spokeswoman Dani Lever. ‘When those signals changed, so did the measures.’

Lever said some of the measures stayed in place well into February and others remain active today.

Some employees were unhappy with Facebook’s handling of problematic content even before the Jan. 6 riots. One employee who departed the company in 2020 left a long note charging that promising new tools, backed by strong research, were being constrained by Facebook for ‘fears of public and policy stakeholder responses’ (translation: concerns about negative reactions from Trump allies and investors).

Videos of rioters entering the Capitol building often went viral themselves through the social media giant 

Facebook was founded in 2004, originally meant to be used as a network for Harvard University students


‘Similarly (though even more concerning), I’ve seen already built & functioning safeguards being rolled back for the same reasons,’ wrote the employee, whose name is blacked out.

Research conducted by Facebook well before the 2020 campaign left little doubt that its algorithm could pose a serious danger of spreading misinformation and potentially radicalizing users.

One 2019 study, entitled ‘Carol’s Journey to QAnon – A Test User Study of Misinfo & Polarization Risks Encountered through Recommendation Systems,’ described the results of an experiment conducted with a test account set up to reflect the views of a prototypical ‘strong conservative’ – but not extremist – 41-year-old North Carolina woman. 

This test account, using the fake name Carol Smith, indicated a preference for mainstream news sources like Fox News, followed humor groups that mocked liberals, embraced Christianity and was a fan of Melania Trump.

Within a single day, page recommendations for this account generated by Facebook itself had evolved to a ‘quite troubling, polarizing state,’ the study found. By day 2, the algorithm was recommending more extremist content, including a QAnon-linked group, which the fake user didn’t join because she wasn’t innately drawn to conspiracy theories.

A week later the test subject’s feed featured ‘a barrage of extreme, conspiratorial and graphic content,’ including posts reviving the false Obama ‘birther’ conspiracy and linking the Clintons to the murder of a former Arkansas state senator. Much of the content was pushed by dubious groups run from abroad or by administrators with a track record of violating Facebook’s rules on bot activity.

Those results led the researcher, whose name was redacted by the whistleblower, to recommend safety measures ranging from removing content with known conspiracy references and disabling ‘top contributor’ badges for misinformation commenters to lowering the threshold number of followers required before Facebook verifies a page administrator’s identity.

Among the other Facebook employees who read the research, the response was almost universally supportive.

‘Hey! This is such a thorough and well-outlined (and disturbing) study,’ one user wrote, their name blacked out by the whistleblower. ‘Do you know of any concrete changes that came out of this?’

Facebook said the study was one of many examples of its commitment to continually studying and improving its platform.

Another study turned over to congressional investigators, titled ‘Understanding the Dangers of Harmful Topic Communities,’ discussed how like-minded individuals embracing a borderline topic or identity can form ‘echo chambers’ for misinformation that normalizes harmful attitudes, spurs radicalization and can even provide a justification for violence.

Examples of such harmful communities include QAnon and hate groups promoting theories of a race war.

‘The risk of offline violence or harm becomes more likely when like-minded individuals come together and support one another to act,’ the study concludes.

Charging documents filed by federal prosecutors against those alleged to have stormed the Capitol have examples of such like-minded people coming together.

Prosecutors say a reputed leader in the Oath Keepers militia group used Facebook to discuss forming an ‘alliance’ and coordinating plans with another extremist group, the Proud Boys, ahead of the riot at the Capitol.

‘We have decided to work together and shut this s-t down,’ Kelly Meggs, described by authorities as the leader of the Florida chapter of the Oath Keepers, wrote on Facebook, according to court records.
