Social media firms could be banned if they fail to protect children from suicide and self-harm material, the Health Secretary has warned.
Matt Hancock said Parliament had the power to block access to the social networks and that the Government “must act” if tech companies failed to purge their sites of such images and videos.
His comments come after a Daily Telegraph investigation found Google promoting graphic suicide manuals high up in its search results, and YouTube allowing advertising on videos showing suicide methods.
Both companies feature the Samaritans’ telephone number, and YouTube has since pulled advertising from the clips.
Speaking on The Andrew Marr Show on Sunday, Mr Hancock said: “If we think they [social media companies] need to do things that they are refusing to do, then we can and we must legislate.
“We are masters of our own fate as a nation, and we must act to ensure that this amazing technology is used for good, not leading to young girls taking their own lives.
“Ultimately Parliament does have that sanction. It is not where I would like to end up in terms of banning them. Of course, there is a great positive to social media too.”
Over the weekend Mr Hancock wrote to social media companies telling them to act “urgently” over suicide content, just days after a father said images on Instagram “helped to kill” his 14-year-old daughter.
In 2017, Molly Russell was found dead in her bedroom after showing “no obvious signs” of mental health issues.
However, her family later found she had been viewing material on Instagram that was linked to anxiety, depression, self-harm and suicide.
Her father, Ian Russell, has called for social media companies to “take responsibility” for the content they are making available to young people.
The Government is currently drawing up a white paper on protecting children from online harm, which is expected to bring in new regulations for the social media and tech companies.
The Telegraph is campaigning for a statutory duty of care to be placed on tech companies to ensure children are properly protected online.
A spokesman for Instagram said: “Our thoughts go out to Molly’s family and anyone dealing with the issues raised in this report.”