Meta, YouTube found liable for ‘addiction’ in the US. Could it have implications in India?
Context
A landmark verdict in Los Angeles has held Meta and YouTube liable for causing mental health harm to a minor through addictive product design. This ruling is globally significant as it challenges the liability protections of tech platforms and coincides with worldwide scrutiny of social media's impact on children. It holds particular relevance for India, where policymakers are actively deliberating on a regulatory framework to protect minors from online harms like addiction and exposure to harmful content.
UPSC Perspectives
Polity & Governance
The US verdict amplifies the debate on the adequacy of India's regulatory architecture for online child safety. India has a multi-pronged legal framework but faces implementation challenges. Key legislations include:
- Digital Personal Data Protection Act, 2023: Imposes strict obligations on platforms (Data Fiduciaries). It mandates verifiable parental consent for processing the data of users under 18 and prohibits tracking, behavioural monitoring, and targeted advertising directed at children.
- Information Technology Act, 2000 and IT Rules, 2021: These form the backbone of India's cyber law. Section 67B of the IT Act criminalizes publishing or transmitting child sexual abuse material (CSAM), while the IT Rules, 2021 require streaming platforms to classify content by age and implement parental locks.
- Protection of Children from Sexual Offences Act, 2012 (POCSO): The primary law against child sexual abuse; its provisions are interpreted to cover online offences such as sexual harassment and the distribution of pornographic content involving children.
- Bharatiya Nyaya Sanhita, 2023: The BNS, which replaces the IPC, modernizes the criminal code to better address digital offences, including the online stalking and harassment of children.
- Juvenile Justice (Care and Protection of Children) Act, 2015: Provides a framework for children in need of care and protection, which can extend to victims of online exploitation.
Despite this comprehensive framework, challenges such as weak age-verification mechanisms, uneven enforcement, and limited digital forensic capacity persist. UPSC may ask candidates to critically analyze the efficacy of these laws and suggest measures for a more robust safe harbour regime that balances innovation with child protection.
Social
The case highlights the growing social crisis of digital addiction and its severe impact on children's mental health. Features like infinite scrolling and algorithmic recommendations are identified as 'addictive by design', engineered to maximize user engagement at the cost of well-being. This raises fundamental questions about child rights in the digital era. For children, excessive and unmonitored screen time is linked to anxiety, depression, social isolation, and exposure to harmful content and cyberbullying. Minors, whose cognitive capacity for self-regulation is still developing, are particularly susceptible to such manipulation. The debate in India must move beyond data protection to encompass the broader concept of digital well-being. This involves fostering a digital environment that supports a child's development, as enshrined in the UN Convention on the Rights of the Child (UNCRC), which India has ratified. A potential UPSC question could be: "In the context of the 'attention economy', discuss the social and ethical implications of addictive technology on child development and suggest a multi-stakeholder strategy to promote digital well-being among minors."
Governance
This issue brings into focus the principles of corporate governance and ethical responsibility for technology companies, particularly 'Significant Social Media Intermediaries'. The US verdict signals a shift towards holding platforms accountable not just for content but also for the architectural design of their products, challenging the traditional defence of platforms as mere conduits. The article notes Australia's move to ban social media for children under 16, a stringent state-led regulatory model that contrasts with approaches emphasizing corporate self-regulation and user empowerment through parental controls. In India, intermediaries have due diligence obligations under the IT Rules; however, age misrepresentation remains a significant loophole that allows children to bypass age-gating mechanisms. Effective governance requires a hybrid approach: strengthening state regulation, mandating robust and privacy-preserving age-verification technologies, and enforcing a stringent code of ethics for digital product design. A key governance challenge is balancing child safety with the right to information and privacy. UPSC could frame a case study on the dilemma faced by a regulator choosing between an outright ban (like Australia's) and a graded, rights-based approach that provides children with safe online access.