Twitter shuns Trust and Safety Council as Musk incites harassment
Yesterday, Twitter’s head of trust and safety, Ella Irwin, was scheduled to meet with Twitter’s independent Trust and Safety Council for an “open conversation and Q&A” on Zoom, AP News reported. Instead, council members received an email informing them that the council had been disbanded, as Twitter officially entered a “new phase” of its trust and safety work.
“We are reviewing how best to bring external perspectives into our product and policy development work,” read the email, signed simply “Twitter.” “As part of this process, we have decided that the Trust and Safety Council is not the best structure for this.”
Founded in 2016, the council brought together dozens of experts on bullying, safety, human rights, suicide prevention, child sexual exploitation, and policy development. Today, rather than clearly defined priorities guiding content decisions on Twitter, visitors to the now-deleted Trust and Safety Council website encounter an error message: “There is nothing to see here.”
The decision comes days after three council members resigned, declaring in a letter that “a Twitter ruled by diktats is not a place for us.” The Washington Post reported that many members were already on the verge of resigning, and one council member, Larry Magid, CEO of ConnectSafely, suggested that Twitter acted immediately to save face. “By disbanding, they fired us instead of letting us go,” Magid told The Post.
In its letter to the council, Twitter suggested that its decision-making had gone astray in the past. Wired previously reported that former Twitter CEO Jack Dorsey was criticized for ignoring the council, and when current CEO Elon Musk took over, Musk reportedly did the same, preferring to make decisions through informal Twitter polls, apparently filled with bots, rather than through discussions with outside experts. Twitter said in its letter that Musk will no longer be the sole decision-maker and that Twitter will “move faster and more aggressively than ever before” by gathering “more timely input” through “bilateral or small group meetings.”
Irwin recently confirmed that Twitter would rely more on automated content removal, and a @TwitterSafety thread published this weekend shed light on Twitter’s latest efforts to limit the reach of hate speech on the platform rather than eliminate it. For example, instead of removing tweets containing slurs, the thread says, Twitter wants to make an effort to identify appropriate contexts for slurs, tracking all slurs that appear on Twitter but limiting the reach of those deemed inappropriate. (Irwin did not respond to Ars’ tweets seeking clarity on how this limited reach will affect Twitter Blue subscribers, who pay for extended reach on tweets.)
The Post interviewed several former council members, some of whom remained anonymous out of fear that Musk might direct his many followers to harass them. That’s what happened when Musk responded to a tweet from the council members who resigned, accusing them of a “criminal” failure to act on child exploitation for “years.” This, The Post reported, unleashed “a wave of threats and harassment.”
An organization that previously engaged with the council, the Center for Democracy and Technology, released a statement accusing Musk of “reckless actions to spread misinformation about the council, putting its members at risk and eroding any semblance of trust in the company.”
Concerns about retaliation have been validated by other recent reports. Twitter’s former head of trust and safety, Yoel Roth, was put in a similarly uncomfortable position when Musk tweeted to spread a rumor that Roth’s doctoral research advocated child sexual exploitation. In his tweet, Musk was either misreading or deliberately mischaracterizing an Internet safety analysis in which Roth tracked data on risks to young users of Grindr. A person close to Roth told CNN yesterday that after Musk started pushing the conspiracy theory, the threats Roth received on Twitter “exponentially escalated,” forcing him and his family to leave their home.
While Musk has made it clear that blocking child sexual abuse material on Twitter is a priority, disbanding the council risks alienating the child-safety organizations that worked with it. NCMEC Vice President Gary Portnoy told The Post that his organization will watch how Twitter restructures as it “continues to promote reporting to CyberTipline” and hopes “to continue to have a seat at the table to address child safety on Twitter.”
Patricia Cartes, a former Twitter employee who helped form the council in 2016, tweeted at Musk to explain the benefits he could be forgoing by dissolving it. According to Cartes, the council’s global membership helped Twitter “receive” the views of safety experts “across all geographies and time zones” and identify “edge cases that helped advance safety policy.”
“Their constructive criticism objectively made us better and fostered a safer environment on Twitter,” Cartes said in a tweet. “Without their input and a structure to capture their expertise, it’s unclear how Twitter Safety will be handled and what the checks and balances will look like.”
Twitter did not respond to Ars’ request for comment.