
Important Editorial Summary for UPSC Exam

22 Oct 2022

An online fight where children need to be saved (GS Paper 1, Social Issues)


Context:

  • Recently, the Central Bureau of Investigation (CBI) conducted searches across States and Union Territories as part of a pan-India operation, “Megh Chakra”.
  • The operation, against the online circulation and sharing of Child Sexual Abuse Material (CSAM) using cloud-based storage, was reportedly based on inputs from Interpol’s Singapore special unit, which in turn drew on information received from New Zealand.

 

Circulation of online CSAM in India:

  • In November 2021, a similar exercise code-named “Operation Carbon” was launched by the CBI, with many accused being booked under the IT Act, 2000.
  • In India, while viewing adult pornography in private is not an offence, seeking, browsing, downloading or exchanging child pornography is an offence punishable under the IT Act.
  • However, Internet Service Providers (ISPs) are exempted from liability for any third-party data if they do not initiate the transmission.
  • As the public reporting of circulation of online CSAM is very low and there is no system of automatic electronic monitoring, India’s enforcement agencies are largely dependent on foreign agencies for the requisite information.

 

American and British models:

  • The National Center for Missing & Exploited Children (NCMEC), a non-profit organisation in the United States, operates a programme called CyberTipline for the public and electronic service providers (ESPs) to report instances of suspected child sexual exploitation. ISPs are mandated to report the identity and location of individuals suspected of violating the law.
  • NCMEC may also notify ISPs to block the transmission of online CSAM. In 2021, the CyberTipline received more than 29.3 million reports (99% from ESPs) of U.S.-hosted and suspected CSAM.
  • In the United Kingdom, the Internet Watch Foundation (IWF), a non-profit organisation established by the U.K. Internet industry to ensure a safe online environment with a particular focus on CSAM, works to disrupt the availability of CSAM and to delete such content hosted in the U.K.
  • The IWF engages analysts to actively search for criminal content rather than relying only on reports from external sources.
  • Though the U.K. does not explicitly mandate the reporting of suspected CSAM, ISPs may be held responsible for third-party content if they host or cache such content on their servers. In 2021, the IWF assessed 3,61,062 reports, of which about 70% contained CSAM; seven in 10 reports contained “self-generated” CSAM.

 

Global network:

  • INHOPE, a global network of 50 hotlines (46 member countries), provides the public with a way to anonymously report CSAM.
  • It provides a secure IT infrastructure, ICCAM (“I See Child Abuse Material”), hosted by Interpol, and facilitates the exchange of CSAM reports between hotlines and law enforcement agencies. ICCAM is a tool that facilitates image/video hashing (fingerprinting) and reduces the number of duplicate investigations (a simplified illustration of this idea follows below).
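The “hashing/fingerprinting” mentioned above can be pictured with a minimal sketch: each file is reduced to a short digest, and two reports that produce the same digest are treated as the same material, so duplicate investigations can be avoided. The Python snippet below is only an illustrative simplification using an exact cryptographic hash on hypothetical byte strings; it does not represent ICCAM’s actual implementation, which relies on more sophisticated fingerprinting of images and videos.

```python
import hashlib

def fingerprint(file_bytes: bytes) -> str:
    """Return a SHA-256 digest identifying a file's exact contents."""
    return hashlib.sha256(file_bytes).hexdigest()

# Hypothetical reports received from different hotlines; the same file
# reported twice yields the same fingerprint.
reports = [b"file-bytes-A", b"file-bytes-B", b"file-bytes-A"]

seen = set()
for content in reports:
    digest = fingerprint(content)
    if digest in seen:
        print(f"Duplicate material, skip re-investigation: {digest[:12]}...")
    else:
        seen.add(digest)
        print(f"New material, open investigation: {digest[:12]}...")
```

In practice, an exact hash only matches byte-identical files; perceptual fingerprints (outside the scope of this sketch) are needed to recognise re-encoded or slightly altered copies of the same material.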

 

India’s efforts so far:

Supreme Court of India:

  • The Supreme Court of India, in Shreya Singhal (2015), read down Section 79(3)(b) of the IT Act to mean that an ISP must remove or disable access to illegal content only upon receiving actual knowledge of a court order or on being notified by the appropriate government. ISPs are thus otherwise exempt from liability for third-party information.
  • In the Kamlesh Vaswani (WP(C) 177/2013) case, the petitioner sought a complete ban on pornography. After the Court’s intervention, the advisory committee (constituted under Section 88 of the IT Act) issued orders in March 2015 directing ISPs to disable nine (domain) URLs which hosted content in violation of the morality and decency clause of Article 19(2) of the Constitution. The petition is still pending in the Supreme Court.

 

‘Aarambh India’:

  • ‘Aarambh India’, a Mumbai-based non-governmental organisation, partnered with the IWF, and launched India’s first online reporting portal in September 2016 to report images and videos of child abuse.
  • These reports are assessed by the expert team of IWF analysts and offending URLs are added to its blocking list.

 

National cybercrime reporting portal:

  • The Ministry of Home Affairs (MHA) launched a national cybercrime reporting portal in September 2018 for filing online complaints pertaining to child pornography and rape/gang rape.
  • This facility was developed in compliance with Supreme Court directions with regard to a public interest litigation filed by Prajwala, a Hyderabad-based NGO that rescues and rehabilitates sex trafficking survivors.
  • As not many such cases were reported, the portal was later extended to all types of cybercrime.
  • Further, the National Crime Records Bureau (NCRB), under the MHA, signed a memorandum of understanding with the NCMEC in April 2019 to receive CyberTipline reports to facilitate action against those who upload or share CSAM in India. The NCRB has received more than two million CyberTipline reports, which have been forwarded to the States for legal action.

 

Ad hoc Committee of the Rajya Sabha:

  • The ad hoc Committee of the Rajya Sabha, headed by Jairam Ramesh, in its report of January 2020, made wide-ranging recommendations on ‘the alarming issue of pornography on social media and its effect on children and society as a whole’.
  • On the legislative front, the committee not only recommended the widening of the definition of ‘child pornography’ but also proactive monitoring, mandatory reporting and taking down or blocking CSAM by ISPs.
  • On the technical front, the committee recommended permitting the breaking of end-to-end encryption, building partnerships with industry to develop artificial intelligence tools for dark-web investigations, tracing the identity of users engaged in cryptocurrency transactions to purchase child pornography online, and liaising with financial service companies to prevent online payments for purchasing child pornography.

 

What needs to be done?

  • According to the ninth edition (2018) of the International Centre for Missing & Exploited Children’s report, “Child Sexual Abuse Material: Model Legislation & Global Review”, more than 30 countries now require mandatory reporting of CSAM by ISPs. Surprisingly, India also figures in this list, even though Indian law does not provide for such mandatory reporting.
  • The Optional Protocol to the United Nations Convention on the Rights of the Child that addresses child sexual exploitation encourages state parties to establish liability of legal persons.
  • Similarly, the Council of Europe’s Convention on Cybercrime and its Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse also require member states to address the issue of corporate liability.

Way Forward:

  • It is time India joined INHOPE and established its own hotline to utilise Interpol’s secure IT infrastructure, or collaborated with ISPs and financial companies to establish an independent facility on the lines of the IWF or NCMEC.
  • The Jairam Ramesh committee’s recommendations must be followed up in earnest and the Prajwala case brought to a logical end. India needs to explore all options and adopt an appropriate strategy to fight the production and the spread of online CSAM. Children need to be saved.