French Parliament Passes Under‑15 Social‑Media Ban, Set for September Implementation

Parliament Approves Ban with Overwhelming Majority

The National Assembly voted 130‑21 to prohibit users under 15 from accessing platforms such as TikTok, Instagram and Snapchat, with the measure slated to take effect at the start of the September school year [1][2]. The same text also bans mobile‑phone use in senior high schools (lycées), a provision Macron urged lawmakers to fast‑track through a single‑reading procedure [1][3]. Senate review is pending, but the government expects the accelerated process to meet the September deadline [2][4].

Dual‑List Regulator System Requires Parental Consent for Some Sites

France’s media regulator will maintain two official lists: one of networks automatically blocked for minors and a second of less‑harmful services that can be accessed only with explicit parental approval [2][4]. This approach mirrors age‑verification mechanisms used for online pornography and fulfills obligations under the EU Digital Services Act [1][2]. Lawmakers anticipate that platforms will need to implement robust age‑assurance technologies to comply [2].

Health Watchdog Report Links Heavy Use to Mental‑Health Risks

A December health‑watchdog study found that 50% of teenagers spend two to five hours daily on smartphones, that 90% of 12‑17‑year‑olds use them every day, and that 58% engage with social networks [1][4]. The report associated this usage with reduced self‑esteem and with exposure to content related to self‑harm, drug use and suicide, and noted a surge in family lawsuits against TikTok over alleged teen suicides [1][4]. These findings underpin the government’s claim that children’s “brains are not for sale” to foreign platforms [1][3].

France Joins Global Wave of Youth‑Protection Measures

The French initiative follows Australia’s December 2025 under‑16 ban, which deactivated roughly 4.7 million child accounts [3][4][6][7]. The UK has launched a three‑month consultation on a similar under‑16 restriction, citing the Australian precedent [8]. Macron highlighted the need to shield youths from American and Chinese algorithmic influence, positioning the law within a broader EU push for a minimum age of 16 under the Digital Services Act [1][2].

Industry Adjusts Amid Legal Pressure and Parallel Safeguards

Ahead of a U.S. federal trial on child‑safety claims, Meta announced a temporary block on AI‑generated characters for all teen users on Instagram and WhatsApp, while retaining its standard AI assistant [5]. This move aligns with a wider trend of tech firms restricting generative‑AI tools for minors after lawsuits alleging chatbot‑induced harm [5]. The French ban therefore occurs in a climate of increasing platform self‑regulation and governmental oversight worldwide.

Timeline

Nov 20, 2025 – Australia becomes the first country to enact a nationwide under‑16 social‑media ban, targeting ten major platforms and citing a government‑commissioned study finding that 96% of 10‑15‑year‑olds use social media, with seven in ten exposed to harmful content[3].

Dec 4, 2025 – Meta notifies users aged 13‑15 that their Instagram, Facebook and Threads accounts will be shut down beginning Dec 4, pre‑empting the Australian law and offering a content‑download window before deactivation[24].

Dec 8, 2025 – The Australian eSafety Commissioner announces the ban will take effect on Dec 10, outlines fines of up to A$49.5 million for non‑compliance and explains the regulator’s role in issuing information notices to the ten platforms[22].

Dec 9, 2025 – Platforms roll out age‑verification measures as the Dec 10 deadline approaches; X (formerly Twitter) states it will not comply, while other services pledge to block or suspend under‑16 accounts[27].

Dec 10, 2025 – The ban goes live nationwide; Prime Minister Anthony Albanese calls it a “proud” day and stresses that the measure puts families “back in control” while imposing penalties of up to A$49.5 million for breaches[20].

Dec 10, 2025 – Two 15‑year‑old Australians file a High Court challenge, arguing the ban infringes the implied freedom of political communication; the cases are scheduled for a directions hearing in February 2026[16].

Jan 9, 2026 – One month after the ban, 14‑year‑old Amy says she feels “freer” and uses her phone only when necessary, noting that the loss of Snapchat helped her cope with the Bondi Beach shooting; psychologists warn of short‑term irritability as teens adjust[10].

Jan 12, 2026 – Meta reports blocking roughly 550,000 Australian accounts in the first week of compliance—330,639 on Instagram, 173,497 on Facebook and 39,916 on Threads—while urging age checks be handled at the app‑store level and calling for parental‑exemption provisions[5].

Jan 14, 2026 – South Korea’s Korea Media and Communications Commission (KMCC) asks X to implement safeguards preventing its Grok AI model from producing sexual deep‑fake content and to appoint a minor‑protection officer, citing existing criminal penalties for such material[28].

Jan 16, 2026 – Communications Minister Anika Wells declares the Australian enforcement a “proven effort,” confirming that 4.7 million child accounts have been deactivated since the ban and emphasizing that the policy gives parents confidence in online safety[9].

Jan 20, 2026 – The UK government launches a three‑month public consultation on an under‑16 social‑media ban, granting Ofsted new powers to police school phone policies; more than 60 Labour MPs back the ban while Conservative leader Kemi Badenoch criticises the approach as “dithering”[4].

Jan 23, 2026 – Meta temporarily disables AI‑generated “characters” on Instagram and WhatsApp for all users flagged as minors, a move timed a week before Meta, TikTok and YouTube appear in a Los Angeles federal court trial over alleged harms to children[8].

Jan 23, 2026 – eSafety Commissioner Julie Inman Grant tells the BBC that social‑media firms are complying “kicking and screaming,” underscoring industry reluctance to meet Australia’s under‑16 ban requirements[2].

Jan 25, 2026 – President Emmanuel Macron urges an accelerated fast‑track for France’s under‑15 social‑media ban, declaring that “youths’ minds are not for sale … to American platforms nor to Chinese networks” and linking the proposal to a health‑watchdog report on teen smartphone harms[7].

Jan 26, 2026 – France’s National Assembly votes to adopt the core provisions of the under‑15 ban, including a dual‑list regulator system and a fast‑track procedure aimed at enacting the law before the September school start[1].

Jan 27, 2026 – The French parliament passes the under‑15 ban by a 130‑21 vote; Macron cites scientific consensus and warns that children’s brains “are not for sale,” while the law also bans mobile‑phone use in senior high schools and aligns with the EU Digital Services Act[6].

Late Jan 2026 (upcoming) – A federal child‑safety trial in Los Angeles, scheduled for the week after Jan 23, will hear testimony from Meta, TikTok and YouTube on alleged harms to children, potentially influencing further regulatory action worldwide[8].

Apr 2026 (expected) – The UK’s three‑month consultation on an under‑16 ban is slated to close, after which the government will decide whether to introduce age‑verification requirements and broader phone‑free‑by‑default policies in schools[4].

Sep 2026 (planned) – France intends to bring the under‑15 social‑media prohibition into force at the start of the 2026‑27 school year, pending Senate approval and fast‑track processing[1].
