Delhi High Court directs Google and Microsoft to challenge NCII image removal order

The Delhi HC directs Google and Microsoft to seek review of an order on removing non-consensual intimate images, citing the technological challenges of identifying and taking down such content.

  • Global News
  • 15 May 2024
Tamanna Varshney
  • @TamannaVarshney


The Delhi High Court has directed Google and Microsoft to file a review petition against an earlier order that required search engines to swiftly restrict access to non-consensual intimate images (NCII) without victims having to repeatedly provide specific URLs. The directive follows arguments from both tech giants that identifying and proactively removing NCII images is technologically challenging, even with the aid of artificial intelligence (AI) tools.

This legal tussle originates from a 2023 ruling that mandated search engines to take down NCII content within 24 hours, in accordance with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The ruling stipulated that failure to comply could result in search engines losing their safe harbour protections under Section 79 of the IT Act, 2000. To streamline the process and reduce the burden on victims, the court proposed a system whereby a unique token would be issued upon the initial takedown request. Search engines would then be responsible for ensuring that any resurfaced content is promptly disabled using existing technology, thereby alleviating the need for victims to continuously track and report specific URLs.
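To make the proposed mechanism concrete, the sketch below is a minimal, purely illustrative outline (not taken from the court order or from any company's systems) of how a token-based takedown registry might work: a token is issued on the victim's first report, and any later reappearance of the same content can be disabled by matching its fingerprint against the registry, with no new report from the victim. All class and method names here are hypothetical.

```python
# Hypothetical sketch of a token-based takedown registry: a token is issued on
# the first report, and resurfaced copies of the same content are disabled by
# fingerprint lookup rather than fresh URL reports from the victim.
import secrets
from dataclasses import dataclass, field


@dataclass
class TakedownRecord:
    token: str                      # opaque reference issued to the victim
    content_hash: str               # fingerprint of the reported image
    blocked_urls: set[str] = field(default_factory=set)


class TakedownRegistry:
    def __init__(self) -> None:
        self._by_token: dict[str, TakedownRecord] = {}
        self._by_hash: dict[str, TakedownRecord] = {}

    def register(self, content_hash: str, url: str) -> str:
        """Handle the initial takedown request and issue a unique token."""
        token = secrets.token_urlsafe(16)
        record = TakedownRecord(token, content_hash, {url})
        self._by_token[token] = record
        self._by_hash[content_hash] = record
        return token

    def on_content_resurfaced(self, content_hash: str, new_url: str) -> bool:
        """Called when indexing encounters a previously reported fingerprint.

        Returns True if the URL was disabled under an existing token,
        i.e. without any new action required from the victim.
        """
        record = self._by_hash.get(content_hash)
        if record is None:
            return False
        record.blocked_urls.add(new_url)
        return True
```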

The court also suggested leveraging hash-matching technology, which creates unique digital fingerprints for images, to help in the identification and removal of NCII. Furthermore, it recommended the development of a 'trusted third-party encrypted platform' where victims could register NCII content or URLs. This platform would shift the responsibility of identifying and removing resurfaced content from the victims to the search engines, while also maintaining high standards of transparency and accountability.
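As a rough illustration of the hash-matching idea, and not a description of any tool actually deployed by the search engines, the sketch below computes a simple "average hash" fingerprint for an image and checks whether it falls within a small Hamming distance of any fingerprint in a registry of reported NCII. Production systems (for example, PhotoDNA-style perceptual hashing) are far more robust to cropping and re-encoding; the function names and threshold here are hypothetical, and the code assumes the Pillow imaging library is installed.

```python
# Simplified hash-matching sketch: reduce an image to a short fingerprint and
# compare it against registered fingerprints by Hamming distance.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale and threshold pixels against the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def matches_registry(path: str, registry: set[int], threshold: int = 5) -> bool:
    """True if the image's fingerprint is close to any registered NCII hash."""
    fingerprint = average_hash(path)
    return any(hamming_distance(fingerprint, known) <= threshold for known in registry)
```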

However, both Google and Microsoft have raised significant concerns regarding the feasibility and implications of such measures. Google pointed out the challenges automated tools face in discerning consent in shared sexual content. This difficulty can lead to unintended takedowns, which not only infringe on free speech but also risk the removal of consensual content mistakenly flagged as NCII. Similarly, Microsoft expressed apprehension about the broader implications of proactive monitoring, particularly concerning privacy and freedom of expression. The company highlighted the potential for overreach and the unintended consequences of stringent content monitoring.

These concerns underscore the delicate balance between protecting individuals from NCII and preserving fundamental rights such as privacy and freedom of expression. The technological limitations in automatically detecting and distinguishing between consensual and non-consensual intimate content add another layer of complexity. Despite advancements in AI and machine learning, these tools are not foolproof and can misidentify content, leading to both over-blocking and under-blocking scenarios.

In light of these challenges, the court's directive for Google and Microsoft to file a review petition reflects an acknowledgment of the need for a more nuanced approach. It opens the door for further dialogue and potential adjustments to the existing legal framework to better address the technological and ethical concerns raised by the tech giants.

The proposed solutions, including the use of hash-matching technology and a trusted third-party platform, represent innovative steps towards reducing the burden on victims and enhancing the efficacy of NCII takedown processes. However, their implementation must be carefully calibrated to ensure they do not inadvertently compromise the very rights they aim to protect.

As this case progresses, it will likely serve as a critical reference point for the ongoing global discourse on balancing digital rights and responsibilities. The outcome could have far-reaching implications for how tech companies handle sensitive content and protect users' rights in an increasingly digital world. The evolving legal and technological landscape will necessitate continuous adaptation and collaboration between courts, tech companies, and civil society to safeguard individuals while upholding essential freedoms.
