After British teenager Molly Russell took her own life in 2017, the finger of blame was quickly pointed at Instagram. The 14-year-old, who had shown no obvious signs of mental health problems, had been viewing images of suicide on the Facebook-owned platform, in yet another scandal for the embattled company.

Over the weekend, British health secretary Matt Hancock wrote to Facebook to say he was “appalled” that suicide content was so easy to find on its site. “We can legislate if we need to,” Mr Hancock told The Andrew Marr Show. “We must act to make sure that this amazing technology is used for good, not leading to young girls taking their own lives.”

Hancock’s comments reflect growing support among policymakers around the world for regulating social media companies. As Facebook’s power has grown, trust in the platform has waned. But founder Mark Zuckerberg wants to avoid different countries imposing different laws. Instead, he is trying to appease regulators by creating Facebook’s own legal system to hold the platform accountable.

Zuckerberg has been floating this idea for months. Speaking in April, he outlined his vision for an independent appeals board that would oversee decisions on content policy: “You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don’t work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.”

Then on Monday, the day after Hancock’s comments, the social network released new details of its supreme court idea in a draft charter for an external board. The document included details on how the board would be able to reverse Facebook’s decisions about whether to remove certain posts from the site.

For the past few years, Facebook has struggled with a string of content controversies. The site has come under fire for allowing users to live-stream suicide and for hosting hate speech that has been linked to ethnic violence.

Until now, the company has been using a combination of artificial intelligence and an army of human moderators to take down content that violates its community standards. But neither method has enabled Facebook to escape controversy.

In a November blogpost, Zuckerberg summarised his ongoing dilemma: “An important question we face is how to balance the ideal of giving everyone a voice with the realities of keeping people safe and bringing people together. What should be the limits to what people can express?”

By creating the board, Facebook will be hoping to absolve itself of some responsibility when it comes to difficult issues. New details released on Monday explained how the board would include 40 global experts with experience in content, privacy, free expression, human rights, journalism, civil rights and safety. Members will serve part-time for three years, and their terms can be renewed only once. To ensure independent judgement, they cannot be current or former Facebook employees and they cannot be lobbied.

Facebook will refer decisions to the board when cases are particularly difficult to resolve or when an issue attracts significant public debate. Board members will be able to consult geographic and cultural experts to make sure their decisions can be applied to the two billion people who use the platform globally.

This is untested ground, and over the next six months Facebook will consult a range of experts to decide what membership of the new board should look like and what the scope of its powers should be.

But Wired journalist Issie Lapowsky warned of issues ahead: “To compare Facebook's board to the Supreme Court is to minimize the sheer complexity of what Facebook is setting out to accomplish.” The board would have to choose from several million cases each week, and its decisions would affect a population roughly seven times the size of the United States.