Social media executives from Meta, Snap, YouTube, TikTok and X are being summoned to Downing Street on Thursday for a crucial meeting with Prime Minister Sir Keir Starmer and Technology Secretary Liz Kendall over online safety for children. The tech bosses will be questioned about what measures they are taking to safeguard young people and respond to parents' concerns, as the government continues its review of whether to introduce an outright ban on social media for under-16s, following Australia's lead. Sir Keir has stressed that the meeting will centre on ensuring "social media companies step up and take responsibility", warning that "the consequences of not taking action are severe" and that the government has a duty to parents and the next generation to put children's safety first.
The Downing Street Confrontation
Thursday's gathering represents a pivotal moment in the government's push to hold tech giants accountable for their role in protecting vulnerable young users. It comes at a crucial juncture: just hours earlier, Parliament rejected calls for an outright ban on social media for those under 16, despite support from the House of Lords. Instead of introducing a blanket prohibition, MPs chose to give ministers authority to establish their own restrictions, signalling the government's preference for a more tailored regulatory approach over a sweeping legislative ban.
The timing of the Downing Street summit reflects the government's determination to appear decisive on online safety whilst navigating complex political and commercial pressures. Professor Gina Neff of the University of Cambridge's Minderoo Centre for Technology and Democracy suggested the summit allows the government to show it is taking the initiative on online harms. Downing Street has already acknowledged that some platforms have made progress, introducing measures such as turning off autoplay for children by default and giving parents greater control over device usage, though critics argue far more needs to be done.
- Tech executives questioned on safeguarding measures and how they address parents' concerns
- Ministers considering restrictions on social media for under-16s, drawing on Australia's example
- MPs rejected a full ban but granted ministers powers to set their own restrictions
- Some platforms have already introduced measures such as turning off autoplay for children
Parliamentary Rejection and the Wider Discussion
Wednesday evening’s parliamentary vote dealt a significant blow to supporters of a comprehensive social media ban for those under 16, marking the second occasion MPs have dismissed such proposals despite strong support from the House of Lords. The administration’s choice to prioritise ministerial flexibility over formal legislation reflects a more conservative strategy, with officials contending that an outright ban would be premature given ongoing policy considerations. This approach allows the government flexibility in crafting bespoke restrictions rather than implementing a blanket prohibition that some fear could prove difficult to enforce and effectively oversee across multiple platforms.
The rejection has heightened debate about whether the UK is sufficiently safeguarding its young people from internet-based threats. Whilst the government argues that giving ministers authority to implement bespoke guidelines represents a more pragmatic solution, critics argue this approach falls short of the decisive action the situation requires. Recent evidence from Australia, where a ban on social media for under-16s was established in December 2025, reveals that approximately 60 per cent of underage users continue to access platforms regardless, raising serious questions about the effectiveness of legislative bans and suggesting the challenge extends far beyond simple prohibition.
Cross-Party Criticism
The parliamentary decision has attracted sharp criticism from the opposition benches. Conservative shadow education secretary Laura Trott accused Labour MPs of letting down parents and children by rejecting the ban, arguing that other nations are recognising social media's harms whilst the UK falls behind under the current government. Liberal Democrat education spokeswoman Munira Wilson reinforced these concerns, declaring that "the time for half-measures is over" and demanding immediate measures to restrict the most damaging platforms for young users rather than incremental regulatory adjustments.
Australia’s Cautionary Tale
Australia's experience with online platform restrictions offers a cautionary case study for policymakers evaluating comparable approaches in the UK. When the country introduced a prohibition on social media for under-16s in December 2025, it was celebrated as a landmark step in protecting young users from online harms. However, new research from the Molly Rose Foundation has revealed a concerning reality: more than 60 per cent of young Australians continue using online platforms despite the legislative prohibition. This substantial rate of non-compliance suggests that legal prohibitions alone may not stop determined young users from accessing the services they want to use.
The Australian findings hold considerable implications for the UK's continuing policy deliberations. If a similar ban were introduced in Britain, the evidence indicates implementation would present substantial challenges, with young people likely to circumvent age-verification systems and other restrictions. The data undermines arguments that a simple legislative prohibition represents a quick fix to online safety concerns, instead pointing towards the need for a broader approach integrating regulatory measures, platform responsibility, parental oversight tools, and digital literacy education to meaningfully address the risks young people face online.
| Key Finding | Implication |
|---|---|
| Over 60% of underage Australians still access social media despite ban | Legislative prohibitions alone cannot effectively prevent determined young users from accessing platforms |
| Ban introduced in December 2025 has failed to achieve widespread compliance | Enforcement mechanisms remain weak and young people find workarounds to restrictions |
| Blanket bans do not address underlying appeal of social media to young people | Multi-faceted approach combining regulation, platform accountability, and education is necessary |
Safety Experts Push for Substantive Measures
Child safety advocates and digital rights experts have intensified calls for tech companies to take concrete steps beyond self-regulation. The Molly Rose Foundation, established in memory of 14-year-old Molly Russell who died by suicide after viewing harmful material online, has been particularly vocal in demanding systemic change. Rather than implementing sweeping prohibitions that prove difficult to enforce, campaigners argue the focus must shift towards holding platforms accountable for the algorithms that promote dangerous material to vulnerable users.
Andy Burrows, chief executive of the Molly Rose Foundation, has stressed that Thursday's Downing Street meeting constitutes a critical moment for government action. The charity has repeatedly maintained that social media companies have the technical capability to introduce strong protections, yet frequently prioritise engagement metrics over user wellbeing. Experts emphasise that real safeguarding requires platforms to overhaul their recommendation systems, enhance content moderation, and provide parents with meaningful tools to monitor their children's online activity effectively.
The Algorithm Problem
At the heart of these concerns are the algorithmic systems that determine what content younger audiences see. These algorithms are designed to maximise user engagement, often pushing sensational, harmful, or addictive content to at-risk groups. Overhauling these mechanisms represents one of the most critical issues in online safety, requiring platform transparency about how their recommendation engines operate and what safeguards exist, as the simplified sketch after the list below illustrates.
- Algorithms emphasise engagement over user wellbeing and safety
- Platforms need to improve transparency about algorithmic recommendation processes
- Independent audits of algorithmic damage are essential for ensuring accountability
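To make the trade-off campaigners describe more concrete, the short Python sketch below contrasts ranking content purely by predicted engagement with applying a simple safety penalty for under-16 accounts. It is a toy illustration only, under assumed names and weights (`Post`, `engagement_score`, `harm_flag`, `harm_penalty`), and does not reflect any platform's actual recommendation system.

```python
# Toy illustration only: a minimal ranking sketch, not any platform's real system.
# All names and numbers here are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    engagement_score: float  # predicted engagement (likes, shares, watch time)
    harm_flag: float         # 0.0 (benign) to 1.0 (flagged as potentially harmful)

def rank_engagement_only(posts):
    # The pattern critics describe: order purely by predicted engagement,
    # so sensational or harmful posts can rise to the top of the feed.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def rank_with_safety_weight(posts, minor_account=True, harm_penalty=5.0):
    # One possible safeguard: subtract a penalty proportional to the harm flag
    # for accounts identified as belonging to under-16 users.
    def score(p):
        penalty = harm_penalty * p.harm_flag if minor_account else 0.0
        return p.engagement_score - penalty
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Homework tips", 3.0, 0.0),
        Post("Extreme dieting challenge", 9.0, 0.9),
        Post("Football highlights", 6.0, 0.1),
    ]
    print([p.title for p in rank_engagement_only(feed)])
    print([p.title for p in rank_with_safety_weight(feed)])
```

In this toy example the flagged post tops the engagement-only feed but is demoted once the penalty is applied, which is the kind of behavioural change, and the transparency needed to verify it, that campaigners are asking platforms to demonstrate.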
What’s Coming Next
Thursday's summit at Downing Street will set the tone for the government's stance on online child safety in the period ahead. Following the meeting, Sir Keir Starmer and Liz Kendall are expected to set out their conclusions and determine whether current voluntary schemes from tech companies are adequate or whether stronger statutory intervention is needed. The government remains midway through its consultation on whether to implement an Australia-style ban on social media for under-16s, with the conclusions from this week's talks likely to shape the final policy direction.
Ministers have expressed their preference for giving themselves powers to impose restrictions rather than introducing a complete prohibition, citing concerns over practical implementation and effectiveness. However, opposition MPs, child safety groups, and parents are likely to maintain sustained pressure for more decisive action. The next few weeks will prove crucial in determining whether technology firms can show real commitment to safeguarding young people or whether the government will introduce new laws to enforce compliance with more stringent safety standards.