


CHILD SEXUAL ABUSE MATERIAL (CSAM)

Source: indianexpress

1. Background

  • Child Sexual Abuse Material (CSAM) has different legal definitions in different countries. 
  • At a minimum, CSAM is defined as imagery or videos depicting a person who is a child and is engaged in, or is depicted as being engaged in, explicit sexual activity.

2. Child Sexual Abuse Material (CSAM) vs. child pornography

Sometimes CSAM is referred to as child pornography. However, the term “child pornography” should be avoided for the following reasons:

  • The term child pornography fails to describe the true nature of the material and undermines the seriousness of the abuse from the child’s perspective.
  • Pornography is a term primarily used to describe material depicting adults engaged in consensual sexual acts distributed for sexual pleasure. Using this term in the context of children risks normalizing, trivializing and even legitimizing the sexual abuse and exploitation of children.
  • Child pornography implies consent, and a child cannot legally give consent.

3. Aid of Technology in CSAM Scanning

  • Every time Google detects an image potentially identified as CSAM, it is assigned a hash so that re-uploads of the same material can be recognised.
  • Google Search relies on “automated detection and human review, in addition to relying on reports submitted by users and third parties, such as NGOs, to detect, remove, and report CSAM on search platforms”.
  • This applies to photos, videos, and files users might upload on Drive, Photos, etc.
  • The first technology is hash matching, which includes YouTube’s CSAI (Child Sexual Abuse Imagery) Match technology (a simplified sketch of hash matching follows this list).
  • CSAI Match is the technology deployed on YouTube to fight videos of child abuse; it can spot “re-uploads of previously identified child sexual abuse material in videos”, according to Google.
  • This API can also be used by other NGOs and companies to find CSAM by matching content against Google’s databases.
  • Google made this technology available for free to NGOs and its industry partners at the time of the announcement.
  • Google also notes that content identified as CSAM by its machine learning technology is “then confirmed by our specialist review teams”. 
  • It is not clear what technology was used to identify CSAM in the reported cases, and Google has not revealed the accuracy of this AI technology.
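
A minimal sketch of the hash-matching idea referenced above, assuming a plain cryptographic digest for simplicity. Google’s actual systems (such as CSAI Match) rely on proprietary perceptual and video hashing that tolerates re-encoding and editing, so this only illustrates the matching step; the hash set and file name are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of previously identified material.
# Real systems use perceptual hashes, not SHA-256; this only shows the lookup idea.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """True if the file's digest appears in the known-hash database."""
    return file_sha256(path) in KNOWN_HASHES

# Example (hypothetical file): flag it for human review if the hash matches.
# if is_known_match(Path("upload.jpg")):
#     print("Matched a previously identified item; escalate to human review.")
```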

4. The Issue of Child Sexual Abuse Imagery in the Context of Asia

  • New research has placed India at the top of the list of countries from which the maximum number of reports (38.8 lakh) related to suspected online child sexual abuse imagery (CSAI) originated. Of the over 2.3 crore reports available with the United States-based National Center for Missing and Exploited Children (NCMEC) from 1998 to 2017, India, Indonesia and Thailand account for 37%. In terms of volume of reports per 1,000 estimated internet users for each country, however, the top three countries involved in CSAI are Iraq, Thailand and Somalia. While the volume of reports in Thailand is 63.8 per 1,000, the number for India, at 11.9 per 1,000, is much lower (a worked example of this rate appears after this list).
  • The results illustrate that CSAI has grown exponentially globally — to nearly 1 million detected events per month — exceeding the capabilities of independent clearing houses and law enforcement agencies to take action. Of the over 2.3 crore reports of suspected incidents of CSAI, almost a crore or 40% occurred in 2017 alone. That’s an exponential rise from the 5.7 lakh reports NCMEC received in its first ten years of operation.
  • The research, carried out by Google and Thorn, an international anti-human-trafficking organisation, in collaboration with NCMEC, the United States clearing house for all CSAI content detected by the public and online sharing platforms, puts out data on the top 10 countries flagged in reports as the locations of IP addresses of victims and abusive entities. NCMEC is also seen as the largest clearing house in the world, a point the research underlines to justify its reliance on this database.
  • In the list of the top 10 countries in terms of absolute numbers of reported events, Indonesia is in second place with 17.4 lakh followed by Thailand, Mexico, Bangladesh, the United States, Brazil, Vietnam, Algeria and Pakistan. However, the volume of reports per 1,000 estimated internet users varies widely within these ten from 63.8 for Thailand, 41.8 for Algeria and 40.5 for Bangladesh to 11.9 for India, 6.0 for Brazil and 3.3 for the US.
  • The researchers cautioned that, as the information is often based on the IP addresses of abusive entities, it may include VPNs or proxies that skew geographic distributions, and hence too much should not be read into the rankings.
  • “The data shows that ten years ago, 70% of CSAI reports reflected abuse in the Americas. Today, 68% of reports relate to abuse in Asia, 19% in the Americas, 6% in Europe, and 7% in Africa. Also new CSAI content is constantly emerging. 84% of detected CSAI images and 91% of videos are reported only once,” the study observed.
  • It concludes that “the internet has outpaced the capabilities of independent clearing house analysts and law enforcement to respond. It is suggested that this exponential growth and the frequency of unique images requires re-imagining CSAI defences away from fingerprint-based detection and manual review. Instead, we argued that researchers need to develop algorithms that automatically detect CSAI content, cluster images and videos of victims, and ultimately surface identifying features to help law enforcement”.
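
As a concrete illustration of the per-1,000 metric quoted above, the sketch below assumes the rate is simply the number of reports divided by the estimated number of internet users, multiplied by 1,000. The report count is taken from the article; the internet-user figure is back-calculated from the quoted rate and is not stated by the source.

```python
def reports_per_1000(reports: int, internet_users: int) -> float:
    """Suspected-CSAI reports per 1,000 estimated internet users."""
    return reports / internet_users * 1000

# India: 38.8 lakh (3.88 million) reports, quoted rate of 11.9 per 1,000.
india_reports = 3_880_000
# Back-calculated from the quoted rate, not stated by the source: ~326 million users.
implied_users = round(india_reports / 11.9 * 1000)

print(round(reports_per_1000(india_reports, implied_users), 1))  # -> 11.9
```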
