State and local officials in at least five states have recently sparred with Facebook while trying – and sometimes failing – to remove false election information or post accurate notifications of their own on the platform, according to interviews and emails obtained by USA TODAY.
Last December, for instance, the state election board in Oklahoma told Facebook that proposed language for a public service announcement was inaccurate.
The board feared that the registration deadline reminder – written by Facebook – could lead to people missing the deadline, and it urged the company to adopt different wording.
“We don’t want someone to be disenfranchised because they misunderstand a Facebook reminder,” Misha Mohr, the agency’s public information officer, told the company in a series of emails. “We think Facebook’s heart is in the right place on this. We just think it is important that Oklahomans get correct information.”
But for almost a month, Facebook refused to change the notification because its legal department had already signed off on the language.
“We were very intentional about the language we selected, but do appreciate the feedback,” a manager said in an email.
The election officials said they were dumbfounded because the language they wanted to use came directly from the state election laws. (Facebook eventually agreed to retool the language before the notification launched.)
“We could not believe the struggle we had to put up the correct language,” Paul Ziriax, the secretary of the Oklahoma election board, said in an interview.
Tussles like these, playing out behind the scenes, shed light on the sometimes fraught relationship between one of the most powerful companies in the world and the nonpartisan government agencies tasked with establishing facts about the coming elections in a sea of online fiction.
“Facebook has a history of failing to identify and stop adversarial Russian trolls, but now it will block a Chicago election authority from placing (an) ad about when and where people may use early voting,” Jim Allen, a communications consultant for the Chicago Board of Elections, emailed Facebook in March 2019.
The company would not allow the city to post election information unless Allen registered the post as a political advertisement, which Mother Jones first reported last year. It was one of three times his agency had clashed with Facebook over similar issues since October 2018, emails show.
“It’s a public-information effort on when and where people may cast a ballot, pure and simple,” Allen wrote. “This is objectionable in every possible way.”
Facebook remained steadfast. “I understand your opposition,” a company official responded in an email. “But for now, these are the rules.”
During the 2018 midterms, the Louisiana Secretary of State’s office discovered someone had copied vote counts from a live test on the state website and posted them on Facebook as if they were the final election results. When a public information director alerted Facebook and sent screenshots, the company refused to take the post down because it had not violated the platform’s standards and told the officials they could instead report it to Facebook’s fact-checkers.
“How (is this) not voter suppression?” the director replied. “No one is going to trust the process if they think we’re posting election results before 8 p.m. on Election Day when we’ve told them we don’t. It’s frustrating to report misinformation only to get this message back about your community standards.”
Kevin McAlister, a policy communications manager at Facebook, denied widespread issues with local and state offices.
“We value our relationships with secretaries of state and state election officials,” he said in a statement. “We routinely work with these officials to fight voter suppression on our platforms and we’ve set up a dedicated channel for state elections officials to report to us any issues so that we can quickly investigate them.”
McAlister said the company has taken a number of other steps since 2016 to shore up the platform and prevent the spread of false information, including a new 40-person team to monitor suspicious activity around the primaries and remove posts that violate Facebook policies. The company has also added features to more prominently label posts that have been fact checked, McAlister said.
To be sure, many of the election officials USA TODAY interviewed said they’ve had positive experiences with Facebook and appreciate cases in which the company quickly removes misleading posts or correctly notifies voters about deadlines.
“I really think they have good intentions,” said Ziriax. “But words mean things.”
Facebook has historically balked at policing the content posted on its platform. But after revelations that Russians funded fake political ads during the 2016 elections, Facebook and other social media companies faced pressure to change their policies to prevent voter suppression and ban misleading information about when and where to vote, for example.
Unlike Twitter, Facebook has been under fire for its policy of not fact-checking political advertisements, which critics say allows politicians to lie and then pay the company to proliferate their lies.
At the same time, Facebook’s power over what millions of voters are exposed to alarms some election administrators who feel they’ve been sidelined by policies they often don’t understand.
“They want to be a publisher but they don’t want the responsibilities to go with it,” Allen said in an interview. His office in Chicago eventually relented and registered its posts as political advertisements, even though the agency is legally required to remain nonpartisan.
“It’s a disturbing position,” said Allen.
Earlier this month, the National Association of State Election Directors and the National Association of Secretaries of State met in Washington, D.C., to discuss threats to election security, including what many consider the most pernicious: lies and misleading information spread across social media.
During one question-and-answer session, the secretaries of state in West Virginia and Louisiana raised specific concerns to Facebook representative Khalid Pagan about the company’s handling of election misinformation in their home states.
Mac Warner, the West Virginia Secretary of State, told Pagan his office needed Facebook’s help distinguishing fake videos from real ones. He also said the platform was being used to intimidate candidates, and that simply removing the posts was not enough to deter those looking to sow distrust and confusion in the months leading up to November.
“This can’t continue to happen in this election,” Warner said.
The Rhode Island Secretary of State, Nellie M. Gorbea, asked for assurances about the new tools the company was rolling out ahead of November’s election, including a security program to help election officials secure their Facebook accounts.
In each case, Pagan, who had given a presentation with statistics and charts to show the company’s successes in removing false posts, said he didn’t know enough specifics to comment or punted the answers to others in the company who were not on the panel.
McAlister told USA TODAY in an interview that focusing on the few negative experiences takes away from all the positive work the company has done with these same election offices.
“It’s not like it was all combative,” he said, referencing compliments Facebook received during the conference. “They were saying nice things.”