An Analysis of the Regulation of Children's Online Activities Under the Digital Personal Data Protection Bill, 2022

The DPDP Bill was released by the Ministry of Electronics and Information Technology on November 18, 2022, for public comments. The purpose of the Bill was to provide for the processing of digital personal data in a manner that recognized both the right of individuals to protect their personal data and the need to process personal data for lawful purposes. Though the object behind the proposed Bill addresses the need of the hour, the DPDP Bill imposes certain additional obligations with respect to children.

Introduction

The internet has become an indispensable part of modern life, and the impact it has on young minds cannot be overstated. It provides them with access to a vast array of information and resources, including educational content, news, and entertainment. It also allows them to connect with others and form communities, whether through social media, gaming, or online forums. The use of the internet in day-to-day life has grown considerably over the past two decades. The purpose of this article is not to restate the importance of the internet but to reflect on the intriguing debate over parental regulation of children's internet use under the proposed Digital Personal Data Protection (“DPDP”) Bill, 2022.

The Gordian Knot

Section 10[1] of the proposed DPDP Bill deals with the processing of the personal data of children. It states that ‘The Data Fiduciary shall, before processing any personal data of a child, obtain verifiable parental consent in such manner as may be prescribed’. Under the Bill, a child is defined as an individual who has not completed eighteen years of age[2]. Every time a child creates an account, be it a social media, gaming, or OTT account, the Data Fiduciary[3] involved, i.e., the platform providing the service, would have to secure the consent of the child’s parent or legal guardian before processing their data. The Bill prescribes a penalty of up to Rs. 200 crores for non-compliance[4].

The implications of this proposed section are vast. Currently, most social media platforms, including Twitter, Facebook, and Instagram, require users to be at least thirteen years old to create an account, without any requirement of parental consent. Practically speaking, these platforms do not verify the age claimed by the user, and it is therefore possible to provide an incorrect age in order to create an account. The same goes for all other prospective Data Fiduciaries. From knowledge platforms like YouTube and Quora to entertainment and gaming platforms like Spotify and Steam, all of these services currently set thirteen years as the minimum age to create an account and enjoy their services. To comply with the DPDP Bill, should it be passed, these platforms would not only have to modify their terms and conditions for the Indian jurisdiction but also devise a mechanism for obtaining verifiable parental consent. Since most platforms and websites require the creation of an account to access their features fully, enforcing Section 10 of the DPDP Bill would require an overhaul of how the internet functions: parental consent forms and verification mechanisms would be needed in almost every corner of the internet.

While mandating such monitoring of every online activity of a child might sound fitting in an average conservative Indian household, it is important to understand that doing so fundamentally alters the internet’s defining strength: accessibility of information. Curtailing this would have detrimental effects on a child’s development, as it would allow parents to foreclose the child’s exposure to perspectives that might not agree with their own. It would also run contrary to Article 13 of the Convention on the Rights of the Child[5], which India signed and ratified on December 11, 1992. The Article guarantees children the “right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers”.

Untying the Knot

Perhaps one way to mitigate the issues that could arise if the proposed section is brought into effect is to introduce gradation in the age limit for which it requires consent. In this respect, inspiration can be taken from the Indian Penal Code, 1860,[6] which classifies children by age (below 7, from 7 to 12, etc.) to determine the law applicable to them. Even the much-discussed General Data Protection Regulation, 2016 of the European Union allows member states to lower the age at which parental consent is required from sixteen to as low as thirteen years[7].

The rigidity of the parental consent requirement should also be tempered by a model that considers the evolution and development of children at different ages. France’s model of children’s data privacy rights under the French Data Protection Act, 1978, heavily amended in 2018, could also be looked at. Article 45 of the said Act[8] introduces the concept of “joint consent”. It states that ‘If the child is under 15 years of age, the processing will be lawful only if consent is given jointly by the child and the holder(s) of parental responsibility over that child.’ In essence, consent rests on mutual agreement between the child and the parent(s) holding parental responsibility. Children above the age of 15 years are allowed by the Act to give their own consent.

Conclusion

Thus, while the matter is ultimately for the lawmakers to resolve, they must keep in mind the logistical and sociological effects of enforcing mandatory parental regulation of children’s online activities. If not by reducing the age to a more reasonable one, as other jurisdictions have done, then systems such as gradation in age or joint parent-child consent should be put in place. In Faheema Shirin R.K. vs State of Kerala[9], the Kerala High Court, speaking specifically in the context of students, held that the right to access the internet forms part of the freedom of speech and expression guaranteed under Article 19(1)(a) of the Constitution, observing that ‘Enforcement of discipline shall not be by blocking the ways and means of the students to acquire knowledge’. The concept of the “best interest of the child”, well established in custody and guardianship cases, which places the best possible alternative for the child before the rights of the parents, could perhaps be interpreted broadly and acknowledged by the lawmakers in the present debate as well.

References:

[1] Section 10, The Digital Personal Data Protection Bill, 2022.

[2] Defined under Section 2(3), The Digital Personal Data Protection Bill, 2022.

[3] Defined under Section 2(5), The Digital Personal Data Protection Bill, 2022.

[4] Section 25, The Digital Personal Data Protection Bill, 2022.

[5] Article 13, Convention on the Rights of the Child, 1989 [General Assembly resolution 44/25].

[6] Sections 82 and 83, Indian Penal Code, 1860.

[7] Article 8, General Data Protection Regulation, 2016.

[8] Article 45, French Data Protection Act, 1978.

[9] Faheema Shirin R.K. vs State of Kerala, 2019 [WP(C) No. 19716 of 2019 (L)].


Card Tokenisation: Plugging Personal Information Leaks

Plastic money still captures a large share of the market despite the growing use of the Unified Payments Interface (UPI). Recent data released by the Reserve Bank of India (RBI) indicates a 16.3% year-on-year increase in the usage of debit and credit cards by Indian consumers over the last decade.

Nevertheless, this decade has marked a shift to digital technology, augmented by governmental decisions and policies such as demonetisation, the introduction of UPI, and the Digital India programme, which enabled Indian consumers to shift smoothly to online payment solutions. The pandemic also played a big role in this revolution: with face-to-face interaction minimized, the focus on digital products and payments skyrocketed.

Digital transactions have become the most sought-after payment mechanism, preferred over hard cash, for availing goods and services. As the number of transactions made through mobile applications and platforms increases, customers usually prefer to save their card information on the merchant’s site or platform. Information saved on these sites and platforms is critical financial data of consumers and is considered sensitive personal data. The risk of misuse of such sensitive financial data by hackers or fraudsters looms over every individual, and cases of such misuse have garnered the attention of the authorities.

The RBI, through its notification dated 17th March 2020, made it mandatory for payment aggregators to disable the storage of customer card credentials within the database or server of the company. Though a fixed date for the implementation of this rule was not decided at the time, the RBI later issued notifications directing merchants to comply with the requirement of not storing card data by 31st December 2021. Since then, the RBI has extended the timeline for implementing tokenisation several times, and it has now instructed all parties to delete stored card information before 1st October 2022.

Card tokenisation is a process by which sensitive cardholder data is removed from merchant sites and platforms and replaced with a randomly generated string of numbers and letters, called a token.
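The core idea can be illustrated with a minimal, purely hypothetical sketch: a real Token Service Provider runs a certified, secured token vault, but the essential mechanism is a random surrogate value kept in a mapping that only the vault can reverse. All names below are illustrative, not part of any RBI specification.

```python
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits


class TokenVault:
    """Illustrative token vault: maps card numbers (PANs) to random tokens."""

    def __init__(self):
        self._token_to_pan = {}

    def tokenise(self, pan: str) -> str:
        # The token is random, so it reveals nothing about the card number
        # and cannot be reversed without access to the vault's mapping.
        token = "".join(secrets.choice(ALPHABET) for _ in range(16))
        self._token_to_pan[token] = pan
        return token

    def detokenise(self, token: str) -> str:
        # Only the vault (i.e., the card issuer/network) can map back to the PAN.
        return self._token_to_pan[token]


vault = TokenVault()
token = vault.tokenise("4111111111111111")
assert token != "4111111111111111"              # merchant stores only the token
assert vault.detokenise(token) == "4111111111111111"
```

The point of the design is that a breach of the merchant's database yields only tokens, which are useless without the vault held by the issuer or card network.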


History


The groundwork for regulating the online payment space and ensuring the safety of cardholders has been in the making for a couple of years. As India is yet to enact a dedicated data protection law, the safety of a cardholder’s sensitive personal data stored on merchant websites has been one of the major concerns of cardholders as well as regulators. Moreover, the increase in data theft and the leakage of cardholders’ debit and credit card details did little to contain the concerns of stakeholders.

In January 2019, the RBI released a notification permitting card networks to offer tokenisation. Tokenisation was made optional for customers, and the permission extended to all use cases, such as QR code-based payments, NFC, etc. However, such services could only be offered through mobile phones and tablets; no other devices were permitted to offer the facility at that time.

The RBI later released the guidelines on the Regulation of Payment Aggregators and Payment Gateways, which made it mandatory for payment gateways not to store customer card credentials within the database or on the server accessed by the merchant, with effect from 30th June 2021. This move reiterated the importance of safeguarding customer card details, and the focus once again shifted to the introduction of a tokenisation scheme. Though the guidelines did not specifically mention tokenisation, it did find mention in the subsequent notification on Payment Aggregators and Payment Gateways released by the RBI on March 31, 2021, which called upon payment system providers to put in place workable solutions, such as tokenisation, to safeguard the interests of cardholders. In order to eliminate any ambiguity in the definition of ‘payment aggregators’ under the Payment Aggregators Guidelines, the RBI explicitly stated that the guidelines applied to e-commerce marketplaces engaged in direct payment aggregation, and that e-commerce marketplaces using the services of a payment aggregator were to be regarded as merchants.

The RBI further released a notification in August 2021 amending the 2019 notification and extending the scope of devices permitted to use tokenisation. The framework was extended to include consumer devices such as laptops, IoT devices, wearable devices, etc. A subsequent notification issued in September 2021 further allowed card-on-file tokenisation, permitting card issuers to offer tokenisation services as Token Service Providers (TSPs). TSPs were permitted to tokenise only those cards affiliated with or issued by them. The notification also emphasised that no entity in the card transaction/payment chain, other than the card issuers and/or card networks, shall store actual card data from 1st January 2022. Entities were only allowed to store limited data, such as the last four digits of the actual card number and the card issuer’s name, for compliance and tracking purposes.

The earlier deadline for removing all stored customer card details, 30th June 2021, was extended to 31st December 2021 in view of the huge compliance burden. It was extended again until 30th June 2022, and finally the RBI set the latest deadline of 1st October 2022.


Functioning of Tokens


An e-commerce website, mobile application, or any merchant site offers its consumers different payment methods, ranging from cash to debit/credit card payments to UPI. When it comes to the authentication of the debit or credit card used by the consumer, the entire responsibility for authentication vests with the Payment Gateway service provider. The e-commerce platform or website merely acts as an intermediary facilitating the trade; it is the Payment Gateway service provider that supplies the technology that authenticates the card details. This process of authentication is known as 2FA, i.e., two-factor authentication: the customer’s registered bank sends a One-Time Password (OTP) to the customer’s registered phone number to complete the transaction. The OTP is the key that helps verify that the customer is the rightful owner of the card. Upon entry of the correct OTP, the Payment Gateway service provider authenticates it and completes the transaction.
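The OTP step described above can be sketched as follows. This is a simplified, hypothetical illustration (function names and the 3-minute validity window are assumptions, not taken from any bank's or gateway's actual implementation): the issuer generates a short-lived code, stores only its hash and expiry, and the transaction completes only if the customer echoes the code back in time.

```python
import hashlib
import secrets
import time


def issue_otp(validity_seconds: int = 180):
    """Generate a 6-digit OTP (sent to the customer via SMS) and a server-side record."""
    otp = f"{secrets.randbelow(10**6):06d}"
    record = (
        hashlib.sha256(otp.encode()).hexdigest(),  # store only a hash, not the code
        time.time() + validity_seconds,            # code expires after the window
    )
    return otp, record


def verify_otp(submitted: str, record) -> bool:
    """True only if the submitted code matches and has not expired."""
    digest, expires_at = record
    if time.time() > expires_at:
        return False
    return hashlib.sha256(submitted.encode()).hexdigest() == digest


otp, record = issue_otp()
assert verify_otp(otp, record)           # correct code within the window succeeds
wrong = "000000" if otp != "000000" else "111111"
assert not verify_otp(wrong, record)     # any other code is rejected
```

Storing only a hash of the OTP means that even a read-only compromise of the verification server does not reveal codes still in flight.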

In general, a merchant website or an online portal is only allowed to store details like the cardholder’s name, the 16-digit number on the front of the card, the expiration date of the card and the service code, which is located within the magnetic stripe of the card. On the other hand, these portals and sites are strictly prohibited from storing information such as full magnetic stripe information, PIN, PIN Block and CVV/CVC number of the card.

After the guidelines kicked in on October 1, all card details of individuals stored on merchants’ websites were erased. All information concerning the cardholder, such as the expiry date and the card number, is replaced by the token, a randomly generated alphanumeric string that has no connection with the cardholder’s account. Unlike the card data previously stored, these tokens contain no sensitive personal data of the cardholder.

An individual can tokenise his/her card in the following ways:

  1. The individual will have to visit the preferred merchant’s website for the purchase of any goods or services.
  2. The website will then direct the individual to the preferred payment option, and the individual will be able to enter his/her card details and initiate the transaction.
  3. The website will also contain another option called “secure your card as per RBI guidelines,” which basically generates tokens for the card.
  4. As soon as the individual opts for that option, a One-time Password (OTP) will be generated and sent via SMS or email to the individual.
  5. With the OTP entered, the card details are sent to the bank for tokenisation; the resulting token is then sent back to the merchant, which stores it for the purpose of customer identification.

A token generated for one merchant website will not work on any other merchant website. The cardholder will have to create a separate token for each merchant, and using the same token elsewhere will not initiate a transaction.
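The per-merchant property described above can be sketched in a few lines. This is an illustrative model only (the class and method names are hypothetical): the vault keys each token to the merchant it was issued for, so the same card tokenised at two merchants yields two unrelated tokens, and a token presented by the wrong merchant is rejected.

```python
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits


class CardNetwork:
    """Illustrative card network vault keyed by (merchant, token)."""

    def __init__(self):
        self._vault = {}  # (merchant_id, token) -> PAN

    def tokenise(self, merchant_id: str, pan: str) -> str:
        token = "".join(secrets.choice(ALPHABET) for _ in range(16))
        self._vault[(merchant_id, token)] = pan
        return token

    def authorise(self, merchant_id: str, token: str) -> bool:
        # A token only works for the merchant it was issued to.
        return (merchant_id, token) in self._vault


network = CardNetwork()
t_a = network.tokenise("merchant-A", "4111111111111111")
t_b = network.tokenise("merchant-B", "4111111111111111")
assert t_a != t_b                                 # same card, different tokens
assert network.authorise("merchant-A", t_a)
assert not network.authorise("merchant-B", t_a)   # token is not portable
```

This binding is what limits the blast radius of a breach: a token stolen from one merchant's database cannot be replayed at any other merchant.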


Benefits of Tokenisation


Many customers today prefer digital payment over traditional modes, mainly due to the convenience of not carrying cash. As the frequency of online transactions rose significantly, customers preferred to save their card details on online portals for convenience’s sake. Since such portals store customers’ sensitive personal data, there is always a risk of leakage, theft, or merchant access to that information. Tokenisation therefore provides much-needed safety and assurance by not exposing the customer’s card details.

Tokenisation helps reduce data theft and leaks, as the tokens are in no way connected to an individual’s personal information. Moreover, the process of replacing sensitive personal information with tokens helps build trust and confidence among consumers.


Effects of these Regulations on the Industry


The RBI is striving to formalise the payment aggregator space by bringing non-banking payment aggregators under its regulation. Its main goal in introducing these guidelines is to reduce fraud and protect customers’ interests. Placing the burden on payment aggregators to ensure that merchants are genuine and have no malicious intent will go a long way towards removing dishonest merchants from the market and safeguarding customers’ interests.

Payment aggregators are instructed to credit refunds to the original payment source rather than to an e-wallet. Previously, refunds were credited to e-wallets, making it difficult for consumers to use the money elsewhere.

Although the RBI has reduced the required net worth from INR 100 crores to INR 25 crores, this will still be out of reach for small entities (including start-ups) seeking to enter the industry. Many existing players will be forced to exit the market if they fail to meet the net worth requirement. Moreover, small businesses operating as payment aggregators would find it difficult to implement the required baseline technology recommendations owing to high implementation costs. This could reduce market competition and lead to an oligopoly, harming merchants’ interests in the long term.

It can be stated that these guidelines represent an important advancement in the Indian fintech industry and help ensure that customers’ overall interests are secured.

Conclusion

In the current atmosphere of intense scrutiny over individuals’ personal information, the tokenisation scheme comes as a breath of relief for privacy enthusiasts and the public in general.


The Best Time to Enact Data Protection Laws was 20 Years Ago; The Next Best Time is Now!

The road to personal data protection in India has been rocky. In 2017, India’s Supreme Court upheld the right to privacy as a part of our fundamental right to life and liberty. A panel chaired by retired Justice B N Srikrishna was given the task of drafting a Bill. In 2018, this panel submitted its draft to the Ministry of Electronics & Information Technology. The Personal Data Protection Bill that was eventually tabled in parliament in December 2019 proposed restrictions on the use of personal data without the explicit consent of citizens and introduced data localization requirements. It also proposed establishing a Data Protection Authority.

However, the bill was widely seen as a diluted version of what was originally envisioned by the Srikrishna panel in terms of its ability to truly protect the data/privacy of individuals. The bill was seen to place a significant regulatory burden on businesses and thus viewed as an impediment to the “ease of doing business” in India. A major bone of contention was the bill granting the government a blanket right to exempt investigative agencies from complying with privacy and data protection requirements. Understandably, there was pushback from BigTech, global financial services players as well as activists; even startups were unhappy with the proposed regulatory burdens.

In December 2021, after a number of extensions spanning over two years, the Joint Parliamentary Committee (JPC) that was set up to examine the draft bill submitted its report to the Lok Sabha. The JPC report has reportedly highlighted areas of concern and proposes a number of amendments/recommendations such as:

  • a single law to cover both personal and non-personal datasets;
  • using only “trusted hardware” in smartphones and other devices;
  • treating social media companies as content publishers, thus making them liable for the content they host.

In early August 2022, the government withdrew the Personal Data Protection Bill, 2019, with the promise to introduce a new one with a “comprehensive framework” and “contemporary digital privacy laws”.

 

India needs New Regulations to Plug the Data Protection Gap

That India needs robust data protection and privacy regulations which should be enacted soon is beyond debate. With digitalization becoming ever more pervasive by the day, the longer we are without clear regulations, the greater the risk is to our citizens. Each of the major trends below has the potential to infringe on individual privacy and can give rise to large-scale risks of user data (including personally identifiable information) being leaked/breached and misused:

  • The growth in digital banking, payment apps and other digital platforms.
  • The potential for Blockchain-based apps (in education- e.g., degree certificates, mark sheets; in health care – medical records; in unemployment benefits; KYC, passports etc.).
  • The growing popularity of crypto assets (and the attendant risk of them being used for money laundering, funding terror/anti-national activities etc.).
  • The rise of Web 3.0.
  • The increase in the use of drones for civilian purposes (e.g., delivery of vaccines, food to disaster-hit areas etc).
  • The emergence of the Metaverse as a theatre of personal/commercial interactions.

According to a news report, the IRCTC had sought the services of consultants to help it analyze the huge amount of customer data it holds and explore avenues to monetize that information. Given that the existing bill has been withdrawn, it has deferred the plan till new legislation is in place. Delays in enacting new data protection legislation can thus also impact the revenue growth and profitability of various businesses, which is another reason for quickly bringing in new legislation.

 

The New Data Protection Law should be Well-defined and Unambiguous

While “consent” must be a cornerstone of any such legislation, the government must also ensure that users whose data needs to be protected fully understand the implications of what they are consenting to. For example, each time an individual downloads an app on a smartphone, the app seeks a number of permissions (e.g., to the mic, contacts, camera, etc.). As smartphones become repositories of ever larger slices of personally identifiable information, financial data (such as bank/investment details), and authentication details such as OTPs and emails, the risk of data breaches and misuse causing serious harm increases. There are numerous frauds and digital scams to which citizens are falling prey. Commercial and other organizations that build and manage digital platforms must be held accountable for what data they capture, how they do so, why they need the data, how and where they will store it, who will have access to it, etc.

Just as important is for the new law to define unambiguously terms like “critical data”, “localization”, “consent”, “users”, “intermediaries”, etc. Many companies are establishing their Global Captive Centres (GCCs) in India to take advantage of the large talent pool and process maturity. Strong laws will encourage more players to consider this route seriously, thereby adding to jobs and GDP growth. Such investments also make it easier for India to be a part of emerging global supply chains for services (including high-value ones such as R&D and innovation).

The new law must also address the risks of deliberate breaches. For instance, if hybrid working models are here to stay, who should be held responsible for deliberate data leaks by employees working remotely? Or by their friends, relatives, or others who take screenshots (or otherwise hack into systems) and share data with fraudsters?

While fears of an Orwellian world cannot be dismissed, India’s new data privacy/protection legislation must be sufficiently forward-looking and flexible to give our citizens adequate safeguards. If the government fails to do so, our aspiration to become one of the top three nations on earth will take much longer to realize; worse, it may remain only on paper as a grandiose but unfulfilled vision.


Securing your Data with the Trade Marks Registry

Data privacy has been a cause of concern for individuals and corporates; however, when sharing personal information with government authorities, we tend to overlook this concern. Have you ever wondered how secure your confidential, proprietary, or personal information is when you share it with a government agency like the Trade Marks Registry?

Indian Intellectual Property Offices come under the Ministry of Commerce and Industry; therefore, they are under the control of the Central Government. The Trade Marks Registry, established in 1940, primarily acts as a facilitator in matters relating to the registration of trademarks in India.

The Trade Marks Registry (TMR) is a public filing system. That means once a trademark application is filed with the TMR, a lot of information is placed on record, including the applicant’s and its representative’s personal data, such as mailing address, and the proof of use of the trademark. The digitization of the Registry in 2017 prompted the current practice of recording information on a public access system.

 

Fundamental Concerns

Mailing Address: Open and easy access to such personal information exposes an applicant to scams and other unwanted solicitations. For instance, scam emails from third parties, made to appear as though sent by the TMR seeking maintenance fees, attempt to deceive applicants into paying additional fees. Many who filed international applications between 2005 and 2015 will recall being targeted by international scammers who obtained their information from WIPO’s public records; through oversight, many were duped into paying huge amounts of money.

If an attorney represents an applicant, the TMR does not send correspondence about the trademark application directly to the applicant; it communicates directly with the authorised attorneys. Hence, if applicants receive any mail relating to their trademark, they should consult their attorneys, who can evaluate it to ensure that a scam letter is not mistaken for genuine correspondence.

Documents to support the use of the mark: Applicants are frequently required to submit documentary evidence to support their applications and the commercial use of their marks. Such evidence is often public, but an applicant might disclose information they do not intend to make public, such as bills, financial papers, reports, and other confidential information. Once such information is uploaded or disclosed, there is no mechanism to have it masked or deleted from the TMR’s database.

 

Initiatives by the Trade Mark Registry

In recent times, the TMR has adopted the practice of restricting public access to evidentiary documents uploaded by the competing parties during opposition/rectification proceedings. However, similar documents filed at any other stage, such as filing and pre-opposition prosecution, remain exposed to public access, even if they contain information relating to commercial confidence, trade secrets, and/or other confidential, proprietary, or personal information.

The advantage of such an open and publicly available database is that it serves as countrywide “notice”: an alleged infringer of your trademark cannot claim ignorance of your brand. At the same time, disclosure of such information exposes applicants to email scams and other unwanted solicitations and can also harm their competitive position in the market.

In September 2019, on account of various representations made by numerous stakeholders regarding the TMR’s display of confidential, proprietary, and personal information,[1] a public notice was issued by the Registry, inviting stakeholders’ comments on the aforesaid concerns.

The TMR proposed the classification of such documents into two categories:

  • Category I: Documents that are fully accessible and available for viewing or downloading by the public.
  • Category II: Documents for which details will be available in the document description column, but viewing and downloading will be restricted.

 

Roadblocks and Viable Course of Action

Notably, the Right to Information (RTI) Act, 2005, obligates public authorities to make information available to the public in a convenient and easily accessible manner. There are notable exceptions to this rule: information relating to commercial confidence and trade secrets is exempt from disclosure insofar as its disclosure would cause a competitive handicap for the disclosing party, and personal information is exempt to the extent that its disclosure would invade privacy or has no relation to public activity or interest.

Hence, it is crucial to understand that while the classification suggested by the TMR might seem like a good initiative on the surface, the lack of concrete boundaries for the terms “confidential” and “personal” information leaves the Registry with unchecked discretion to generalise datasets and restrict access to documents on the TMR website. A simple example is the data collected by the TMR through pre-designated forms, including Form TM-A, Form TM-O, etc. Most of these forms mandate the submission of certain personal information, including the proprietor’s name, address, telephone number, etc. However, this cannot mean that the TMR simply denies the general public access to such trademark application forms, as that would defeat the primary goal of advertising marks on the Registry: to invite any opposition or evidence against them. Thus, while the objective behind such a classification of documents might be well-intentioned, restricting access to certain documents could create a conflict of interest for the TMR and end up over-complicating due-diligence processes, increasing costs and resource requirements.

Such generalised classifications are, hence, only viable in theory. The TMR might end up entertaining hundreds of RTI applications if it decides to limit access to certain documents, which might be necessary for proper due diligence and prosecution. The free and open availability of documents enables the public to have smoother and easier access to essential records and credentials of the trademark proprietors, thereby allowing the masses to have a better understanding of the prosecution history of important trademarks of the target company.

In the long run, a more sustainable alternative for the TMR might be to introduce a multi-factor authentication system for parties interested in carrying out due diligence or prosecution against a mark. A multi-factor authentication system for gaining access to the records and documents on the Registry might lengthen the entire process in the short run. Nonetheless, the move could be a game changer, because it would allow the Registry to restrict access to the confidential and personal data of its users to parties with an original or vested interest in the registration of a mark.

Such an approach would not only enable the Registry to provide open and efficient access to necessary documents to the parties who have an original or vested interest in the registration of a mark, but it would simultaneously vest it with the flexibility to protect the sensitive, confidential, as well as personal data of its users from scammers or non-interested parties.
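The multi-factor step suggested above could take many forms; one common second factor is a time-based one-time password (TOTP, RFC 6238), verified server-side after the usual login. The sketch below is purely illustrative, assuming a standard TOTP second factor; the TMR has announced no such mechanism and the function names are hypothetical:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() if at is None else at) // step
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def verify_second_factor(secret_b32, submitted, window=1, step=30):
    """Accept codes from the current step or +/- `window` steps (clock drift)."""
    now = int(time.time())
    return any(
        hmac.compare_digest(totp(secret_b32, now + i * step, step=step), submitted)
        for i in range(-window, window + 1)
    )
```

Because the code changes every 30 seconds and is tied to a secret shared only with the registered party, casual scrapers and non-interested parties are filtered out without an RTI-style case-by-case review.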

 

Privacy-by-Design

A Privacy-by-Design approach is the future of the modern web. Until the Registry implements more elaborate internal safeguards on its website and databases to protect the privacy and integrity of the public data contained therein, applicants are advised to work with an experienced trademark attorney who can help reduce the exposure of their information to individuals with ulterior motives and mitigate the harm associated with the misuse of their data.

References:

[1] Public Notice dated 06/09/2019 re Categorization of Documents on the TMR. Accessible at: https://ipindia.gov.in/writereaddata/Portal/Images/pdf/Catergorization_of_Docs.pdf.



The Metaverse and its Numerous Concerns

There is a lot of buzz being generated around the “Metaverse,” which can be defined as a virtual reality-based shared digital world in which users (through their “avatars”) can enjoy three-dimensional, multi-sensory experiences. This rapidly evolving, technology-driven paradigm is a huge shift away from the present, where digital interactions are based on text, audio and two-dimensional images/videos. The excitement around the Metaverse is due to the immense possibilities that exist around how it can be used for social interactions, commerce, media & entertainment, education, manufacturing, healthcare, defense, etc. Not surprisingly, many companies, including in India, are investing in Metaverse capabilities.

While the potential of the metaverse cannot be denied, it is just as important to recognize and acknowledge that there are several grey areas around this paradigm. If timely actions to prevent the misuse of the metaverse are not taken by the global community, we run the serious risk of opening a new Pandora’s Box. And once the proverbial genie is released from the bottle, it is virtually impossible (pun intended) to put it back inside.

The Potential Dangers of the Metaverse

 
What are the biggest fears surrounding the Metaverse? Concerns have been expressed from different quarters around issues relating to the privacy, safety and well-being of people who are active in the metaverse. In the current scenario, people use social platforms to connect with each other. If someone with whom I do not wish to engage seeks to connect with me in a basic digital world, I can easily deny the friend request. Even after having granted them permission initially, I can choose to block such persons. During the time they have permission to engage with me, the worst that can happen is that they send unwanted texts, audio messages or images and videos.

This is bad enough, but in the metaverse, the kind and nature of obscene or harmful content will change drastically; consequently, so will the impact of such material and experiences on vulnerable segments of society. 

For example, in the metaverse, it is quite possible for complete strangers to enter someone else’s personal space – without the latter being aware of who the former is. Given the multi-sensory capabilities of the metaverse, including haptic technology (the sense of touch), the experience and impact can be far worse. Arguably, the metaverse (as it exists currently) lends itself more easily to bullying, sexual abuse or intimidation. Indeed, there have been recent media reports that some VR-based games that are accessible to young children contain inappropriate content.

AI-driven deepfakes can further muddy the waters by creating and distributing patently false content that is almost impossible to detect as fake. There is enough fake information circulating on WhatsApp as it is; think of the danger of content that purportedly shows politicians or others saying things designed to inflame emotions.

NFTs will be key to the evolution and growth of the metaverse, providing owners of physical assets (such as paintings) and intellectual property rights (such as rights to music and movies) with new avenues to monetize them at scale. Cryptocurrencies and tokens are likely to form the principal currency in the metaverse, powering commerce and payments. As of now, cryptocurrencies are anonymous and independent of mainstream banking and financial systems.

In the absence of regulations that are uniformly enforced globally, such parallel payment systems can be easily misused for illegal and immoral activities and transactions, including child sexual abuse. It is likely that fraud and crimes will increasingly crisscross between the current digital world and the metaverse (and perhaps the physical world), making such crimes harder to detect and their perpetrators harder to bring to book.

Addressing the Issues Surrounding Metaverse 

 

A multipronged approach is key to addressing the potential dangers of the metaverse. It is vital to frame appropriate legislation and to arm regulatory agencies with the power to catch and punish violators. The basic premise of such legislation has to be this: if something is illegal or against the law or generally accepted social mores in the “real”, physical world, it must be treated the same way in any parallel “virtual reality” based universe.

However, legislation alone cannot secure the metaverse. It will be essential to hold creators of content, and the platforms that enable its distribution and access, responsible for violations. The metaverse infrastructure needs to be designed with intent, with appropriate safety mechanisms put in place right from the beginning. As a global society, we must learn from our experiences with the downsides of social media platforms (false information, cyber-bullying, digital fraud, etc.) and take preemptive actions that prevent problems before they become common. This is significant because changing processes after people have grown accustomed to them is never easy; also, some damage may already have occurred. It may also be necessary to think of ways to incentivize good behaviour in the metaverse.

The metaverse is expected to surge ahead quickly on its evolutionary path. Its trajectory cannot be predicted in advance; therefore, what is needed is constant vigilance and concerted global action. The UN system is supposed to be the primary keeper of international order, but a number of events over the past couple of decades have painfully driven home the point that the UN architecture needs an urgent and major overhaul. As part of this exercise, it may be useful to establish a new global body tasked with the responsibility of overseeing and governing the metaverse. Regional political/economic blocs must be encouraged to ensure that their members comply with rules and regulations related to the metaverse.



IS 17428 - A New Privacy Assurance Standard in India

Recently, Aditya Birla Fashion and Retail Ltd (ABFR) faced a major data breach on its e-commerce portal. As per reports, the personal information of over 5.4 million users of the platform was made public. The 700 GB data leak included personal customer details such as order histories, names, dates of birth, credit card information, addresses and contact numbers. Additionally, details such as the salaries, religion and marital status of employees were also leaked. Forensic and data security experts were proactively engaged to implement the requisite damage-control measures and launch a detailed investigation into the matter.[1] This demonstrates the need for wider awareness and standardized protocols for personal data management.

The battle for data protection and privacy currently stands in tension with a flourishing data economy. 2021 was a watershed moment in the privacy and data protection dialogue in the country: the call for a comprehensive data protection law was louder than ever, and there were major initiatives on the legislative and executive fronts.

In June 2021, the Bureau of Indian Standards (BIS) introduced IS 17428 for data privacy assurance. It is a privacy framework designed for organisations that handle the personal data of individuals, whether collected or processed. BIS certification under IS 17428 can be seen as an assurance to customers and users that the organization has implemented sound privacy practices. As the statutorily created standards body of the country, the BIS should bring some welcome change to our data management.

IS 17428 is divided into two parts[2]:

  • Part 1 deals with the Management and Engineering parameters that are mandatory for an organization to comply with. This part provides for establishing and cultivating a competent Data Privacy Management System.
  • Part 2 deals with the Engineering and Management guidelines that enable the implementation of Part 1. These guidelines are not mandatory but serve as a reference framework for an organization to implement good practices internally.

 

The Context – Privacy & Data Protection laws in India

 

The Data Protection Bill was expected to be tabled in Parliament back in 2019 but was postponed due to the ongoing pandemic. The country was hoping to pass the Bill last year; however, it was sent to the Joint Parliamentary Committee (JPC) for perusal. The JPC made its report on the Bill public in December 2021.

The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, were implemented primarily to safeguard the sensitive personal data of individuals that is collected, processed, transferred or stored by any organisation, and to enumerate security practices. The Rules lay down certain practices and procedures to be followed by a stakeholder while dealing with sensitive personal data; the international standard IS/ISO/IEC 27001 is one such acceptable standard.

Later, ISO 27701 was introduced, focusing specifically on privacy information management. The Indian enactment has not specifically endorsed any such standards, though standards formulated by an industry association and approved and notified by the Central Government are also deemed appropriate. Against this background, the BIS introducing a standard is a welcome initiative, as it will help bring uniformity to the implementation of privacy practices across Indian industries.

Components of Part 1 of IS 17428[3]

 
Development of Privacy Requirements:

While developing the privacy requirements of the organisation in relation to the data collected or processed, the organisation has to take into consideration various factors such as jurisdiction, statutory requirements and business needs.

Personal Data Collection and Limitation:

The organisation may collect the personal information of individuals only where such individuals have consented to the collection.

Privacy notice: 

The organisation is bound to provide a notice to individuals while collecting information from them; where collection is through an indirect method, it is the organisation’s duty to convey the notice through unambiguous and legitimate means.

The contents of a privacy notice at the minimum should include the following[4]:

  • Name and Address of the entity collecting the personal data
  • Name and Address of the entity retaining the personal data, if different from above
  • Types and categories of personal data collected
  • Purpose of collection and processing
  • Recipients of personal data, including any transfers
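The minimum contents listed above map naturally to a simple completeness check that an organisation could run on a draft notice before publishing it. A hypothetical sketch; the snake_case field names are illustrative, not prescribed by the standard:

```python
# Minimum privacy-notice contents per sub-clause 4.2.2 of IS 17428;
# the field names below are illustrative assumptions, not mandated text.
REQUIRED_NOTICE_FIELDS = {
    "collecting_entity_name", "collecting_entity_address",
    "retaining_entity_name", "retaining_entity_address",
    "data_categories", "purpose", "recipients",
}

def missing_notice_fields(notice: dict) -> set:
    """Return the required fields that are absent or empty in a draft notice."""
    return {f for f in REQUIRED_NOTICE_FIELDS if not notice.get(f)}
```

A notice that omits, say, the recipients of the personal data would fail this check and could be flagged before it reaches individuals.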
Choice and Consent:

As mentioned earlier, the organisation should obtain the individual’s consent at the start of the collection process, while offering the individual a choice over which information they consent to disclose. This entire process should be conducted lawfully and in accordance with the privacy policies implemented by the organisation.

Data Accuracy: 

The data collected by the organisation should be accurate, and in case it is inaccurate, it should be corrected promptly.

Use Limitation: 

The data collected by the organisation should be used only for the legitimate purpose agreed upon, and shall not be used for any other purpose.

Security: 

The organisation should implement a strict security program to ensure that the information collected is not breached or compromised in any manner.

Data Privacy Management System: 

The organisation is required to establish a Data Privacy Management System (DPMS). The DPMS shall act as a point of reference and baseline for the organisation’s privacy requirements/objectives.

Privacy Objectives: 

The privacy objectives of the organisation shall be fixed and set out by the organisation itself. While determining these objectives, the organisation shall consider factors such as the nature of business operations involving the processing of personal information, the industry domain, the type of individuals, the extent to which processing is outsourced, and the personal information collected. Moreover, the organisation shall ensure that the objectives are aligned with its privacy policy, business objectives and the geographical distribution of its operations.

Personal Data Storage Limitation: 

The organisation shall be allowed to retain the information collected from an individual only for the period required by law or until the purpose for which it was collected is fulfilled. Individuals shall have the right to have their personal information deleted from the organisation’s database upon request.

Privacy Policy: 

The organisation shall create and implement a privacy policy that defines its scope and applies to all its business affiliates. The senior management of the organisation shall be in charge of the data privacy function. Moreover, the privacy policy should be in consonance with the privacy objectives of the organisation.

Records and Document Management

The organisation shall keep a record of its processing activities, which shall in turn ensure accountability for data privacy compliance. One way to achieve this is to lay out procedures that help identify various records. In doing so, the organisation should consider factors such as logs demonstrating affirmative action and the options chosen by individuals on privacy consent and notice, evidence capturing events related to the access or use of personal information, and the retention period for obsolete documents.

Privacy Impact Assessment: 

A privacy impact assessment shall be carried out by the organisation from time to time. Such an assessment helps estimate changes and the impact they may have on the data privacy of individuals.

Privacy Risk Management

The organisation shall put in place and document a privacy risk management methodology. The methodology shall determine how the risks are managed and how the risks are kept at an acceptable level.

Grievance Redress:  

A grievance redressal mechanism shall be established by the organisation to handle the grievances of individuals promptly. The organisation shall ensure that the contact information of the grievance officer is displayed or published and that a channel exists for receiving complaints from individuals. Moreover, the organisation shall clearly set out the provisions for escalation and appeal and the timelines for resolving grievances.

Periodic Audits: 

The organisation shall conduct periodic audits for the data privacy management system. The audit shall be conducted by an independent authority competent in data privacy, internal or external to the organization, at a periodicity appropriate for the organization, at least once a year.

Privacy Incident Management: 

Privacy breaches and data privacy incidents shall be reported regularly, and the organisation shall put in place a mechanism to manage such incidents. The process involves identifying the incident in the first stage; investigating the root cause, preparing an analysis and correcting the incident in the second stage; and, in the final stage, informing the key stakeholders, including the Data Privacy Authority, about the breach or incident.

Data Subject’s Request Management: 

The organisation shall develop a mechanism to respond to requests from individuals concerning their personal data. This process shall include means to verify the identity of the individual, provide access to the information, and update the information.

 

How would IS 17428 help in Privacy and Data Protection?

 

The Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 (RSPP and SPDI rules) had been the only law for organisations to follow. The rules did not prescribe or detail any specific requirements or standards in relation to personal data management and in the absence of formulated standards for the protection of the sensitive personal data of individuals, industry bodies were struggling to have uniform procedures. 

This being the case, the introduction of specific standards for personal data management will bring more clarity and help companies adhere to an approved standard prescribed by a government agency. Moreover, the principles set out in this standard accord with internationally recognised privacy principles and will help Indian companies inspire confidence when dealing with their commercial counterparts.

The introduction of record and document management, risk assessment and data subject request management are among the aspects that place onerous responsibilities on companies, making them more accountable and transparent. These aspects lay down procedures and mechanisms for an organisation to improve its privacy management, for example, processes for verification of identity, access to information, evidence of consent-capture events, and retention periods for obsolete documents.

 

The proposed data protection legislation and the IS 17428

 

The IS 17428 standard draws primarily on the OECD privacy principles, the GDPR and ISO 27701. The proposed data protection legislation, on the other hand, diverges from these instruments in many respects. For instance, the IS standard elaborately describes the privacy objectives of an organisation and the factors that must be taken into account. Most of these objectives are covered under Sections 22 and 23 of the draft Bill, but the standard nevertheless recommends a few additional factors, such as geographical operations, industry domain and the type of individuals, to be taken into consideration while drafting privacy objectives. How much discretion industries have in creating their own privacy standards remains unclear.

Section 28 of the draft Bill deals with the records and document management of the data collected or processed, and the standard covers almost every element of that section. In addition to the considerations mentioned under the Bill, the standard goes further and emphasises the need to establish a policy on the preservation of obsolete policies and process documents. Data and record-keeping should be for a defined period; most other legislation prescribes an average retention period of seven years, and keeping data beyond such a reasonable period serves little purpose. Why the standard provides for the retention of obsolete data is again unclear.

The standard can only be made fully effective once data protection legislation is enacted. For instance, while the standard envisages an appeal mechanism as part of grievance redressal, it does not establish the appellate machinery; this part of the standard can be put to use only after the Data Protection Authority under Section 32 is constituted. The standard also calls for an investigative process in the event of any breach or compromise of data. An organisation is welcome to conduct an onsite or internal investigation into breaches or incidents, but here too an independent authority is required to investigate in a legitimate and fair manner.

In short, I am afraid the standard has failed to take into account the special requirements contemplated under the PDPB, 2019, which may eventually become the law of the country; once that law is enacted, this standard will also need to be modified. The government has made no announcement under the RSPP and SPDI Rules that IS 17428 is an appropriate standard certifying compliance in personal data management. In the absence of such explicit endorsement, ambiguity persists as to whether adopting this standard amounts to sufficient compliance under the said Rules.

Finally, with the Data Protection Bill around the corner, the Data Protection Authority envisaged under the legislation shall have the power to issue codes, guidelines and best practices for protecting the privacy of data subjects. How the DPA will view the IS 17428 standard framed by the BIS, and whether the proposed rules will offer a different set of practices, will be an interesting development to observe.

References:

[1] https://economictimes.indiatimes.com/industry/cons-products/fashion-/-cosmetics-/-jewellery/abfrl-faces-data-breach-on-its-portal/articleshow/88930807.cms

[2] The IS 17428 was established on November 20, 2020 and notified in the official gazette on December 4, 2020. Please see the notification available at: https://egazette.nic.in/WriteReadData/2020/223869.pdf (last visited Jan 18, 2022).

[3] Supra note 2.

[4] Sub-clause 4.2.2 of the IS Requirements: “Privacy Notice”.

 

 

Photo Credits:

Image by Darwin Laganzon from Pixabay 



Non-Personal Data Governance Framework, 2020

The realm of the internet has become an information powerhouse and data has become the new endowment of resources that governments and corporate entities are eager to tap into. The transformation in the digital environment and the emergence of information-intensive services has made data a necessary raw material for most undertakings.

Reports suggest that, in 2019, every minute saw 277,000 Instagram stories posted, 4.4 million Google searches and over 9,700 Uber rides. Today, data is an asset to various businesses and holds importance in investments, mergers and acquisitions, and/or direct monetization.

 

While the discussion on ‘personal data’ has revolved around privacy and security concerns, non-personal data is being eyed as an economic opportunity to augment public or private interest, one that must not be squandered. Considering the value proposition attributed to non-personal data, its legal aspects were sought to be dealt with separately from ‘personal data’, which would be governed by the Personal Data Protection Bill, 2019, currently on the brink of finalization.

 

Consequently, an Expert Committee (“Committee“) was constituted by the Ministry of Electronics and Information Technology (“MeitY“) to study various issues relating to non-personal data. The Committee submitted its Report on Non-personal Data Governance Framework for comments from stakeholders in July 2020.

 

The report highlighted that data regulation is essential to utilize the maximum potential in data by realizing its economic, social, and public value. The need to regulate data stems from the imbalances in bargaining power between the companies that lead to the creation of data monopolies. Moreover, the privacy concerns revolving around the dilution of shared data must be tackled.

 

Non-Personal Data (“NPD“) is the data that cannot be identified with a particular individual, for example, weather forecast, traffic details, geospatial information, production processes, anonymized personal data, etc.

 

  1. Committee’s Proposal for Non-Personal Data Regulation

 

The NPD Governance Framework outlines norms for collection of data and data sharing by entities. The salient features of the proposed framework are:

 

  • The NPD framework provides key roles for all the participants such as Data Principal, Data Custodian, Data Trustees and Data Trusts.
  • Classification of NPD: Non-personal Data is further classified into Public NPD, Community NPD and Private NPD. Public NPD is collected or generated by the government or its agencies and includes data collected or generated in the course of executing publicly funded works (e.g., public health information, vehicle registration, etc.), excluding data explicitly declared confidential under law. Community NPD is data about an animate or inanimate phenomenon relating to a particular community of natural persons (e.g., data collected by e-commerce platforms or telecom operators). Private NPD is NPD collected or produced by non-governmental entities or persons.
    • Ownership of non-personal data: Where non-personal data is derived from the personal data of an individual, the data principal for the personal data will also be the data principal for the NPD. Further, the rights over community NPD collected in India will vest in the trustee of that community.
    • Sensitivity of NPD: The Committee has also defined a new concept of ‘sensitivity of NPD’, as NPD can also be sensitive from the perspective of: a) national security or strategic interests; b) sensitive or confidential information relating to businesses; and c) anonymized data, that bears a risk of re-identification.
    • Data Businesses and data disclosures: The framework also creates a new horizontal classification called a ‘Data Business’, which arises when any existing business collects data beyond a threshold level. Such Data Businesses must register and furnish information on what data they collect, its purpose, and the nature of the data stored. Registration is not mandatory for businesses collecting data below the threshold.
    • Non-Personal Data Regulatory Authority: NPD Regulatory Authority shall ensure that data is shared for sovereign, social and economic welfare, for regulatory and competition purposes, and also that all stakeholders adhere to the rules and data sharing requirements.
  2. Unanswered Questions: Shortcomings of the Proposed Framework

 

Attempting to govern the NPD is a commendable effort, however, it seems that there is a slew of questions that are left unanswered. The following are the issues relating to the proposed framework:

 

  • The foremost reason to govern NPD, as highlighted by the Committee, is the imbalance in the digital ecosystem. However, the sources of these imbalances have been neither identified nor analysed, nor has it been clarified how the proposed regulations would resolve these inequities.
  • Ambiguous classification of NPD: The various types of NPD potentially overlap, and clearly demarcating a line between the three types would be difficult. One of the three types is Community NPD, yet there is no clarification as to how a ‘community’ would be determined. The definition of ‘community’ is wide: under it, even religious groups, residents of the same locality or people with the same educational background would constitute valid communities, which may have conflicting interests over data shared with the government. Without guiding principles, companies will be forced to make legally binding decisions on what they deem a valid community, the scope of data to be shared, and the resolution of competing claims, which is problematic at various levels. Moreover, there could be various interests in a particular dataset, and in such cases it remains ambiguous who would be entrusted with the data.
  • Anonymization of Personal Data to Non-Personal Data: The process of converting personal data into non-personal data by removing certain identifiers or credentials is termed ‘anonymization’. Anonymization would undoubtedly convert a set of personal data into non-personal data, but such data runs the risk of re-identification. Further, although anonymization is essential, excessive anonymization could render the data over-generalized and useless.
  • Reactions of stakeholders to the sharing of data: Mandatory data sharing is highly criticized by stakeholders, as it undermines the investments businesses have made and hands the value of their intellectual property to competitors. Such ‘forced data sharing’ is counterproductive and would have a negative effect on foreign trade and investment. NPD can constitute trade secrets that may be protected by IP laws; sharing this data raises concerns around the right to carry on business and India’s obligations under international trade law. The purposes for data sharing under the framework, namely ‘sovereign’, ‘core public interest’ and ‘economic’ purposes, essentially cover all the data held by companies and must be narrowed down.
  • Lack of clarity on who the data trustees really are: There is ambiguity regarding who will be a data trustee. It is not apparent whether private, for-profit organizations or entities within the government could be data trustees. The position regarding a data trustee’s independence and conflicts of interest also remains murky. It is essential that the roles and functions of these bodies be comprehensively defined.
  • User Consent: The NPD framework also proposes that the user’s consent be obtained before data is anonymized, but it remains particularly unclear how such consent would be collected. Further, a company would need to invest resources in obtaining user consent, while mandated data sharing offers it no incentive in return and could push it into losses.
  • Over-Regulation by the Non-Personal Data Authority: Creating an altogether new authority for NPD would lead to potential regulatory overlap, given that the Data Protection Authority already addresses and enforces privacy concerns and the Competition Commission of India oversees consumer welfare.
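The re-identification risk noted above can be made concrete with a small sketch. This is purely illustrative and not part of the NPD framework: all record fields, names, and the linkage rule below are hypothetical. The idea is that stripping direct identifiers (a name, a licence number) still leaves quasi-identifiers (a pincode, a birth year) that can be joined against another dataset to put the name back.

```python
# Illustrative sketch only: naive "anonymization" that drops direct
# identifiers, and a join on quasi-identifiers showing why
# re-identification remains a risk. All records are hypothetical.

def anonymize(records, direct_identifiers):
    """Remove direct identifiers (e.g. name) from each record."""
    return [{k: v for k, v in r.items() if k not in direct_identifiers}
            for r in records]

def reidentify(anon_records, public_records, quasi_identifiers):
    """Link 'anonymized' rows back to named rows via shared quasi-identifiers."""
    matches = []
    for a in anon_records:
        key = tuple(a[q] for q in quasi_identifiers)
        for p in public_records:
            if tuple(p[q] for q in quasi_identifiers) == key:
                matches.append((a, p["name"]))
    return matches

dataset = [
    {"name": "A. Kumar", "pincode": "600001", "birth_year": 1990, "vehicle": "sedan"},
    {"name": "B. Singh", "pincode": "110001", "birth_year": 1985, "vehicle": "SUV"},
]
anon = anonymize(dataset, direct_identifiers={"name"})

# A hypothetical public list (e.g. a voter-roll extract) that happens to
# share quasi-identifiers with the "anonymized" dataset:
public = [{"name": "A. Kumar", "pincode": "600001", "birth_year": 1990}]

linked = reidentify(anon, public, quasi_identifiers=("pincode", "birth_year"))
```

Here the sedan owner's record, despite having no name field, is re-linked to "A. Kumar" through pincode and birth year alone, which is why over-aggressive generalization (the other extreme) is often used to blunt such joins.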
Conclusion

This effort of the Ministry to set up a Committee to study NPD, which may subsequently lead to legislation governing NPD in India, is praiseworthy; however, a number of issues need reconsideration. Stakeholders have expressed anguish over the mandatory sharing and disclosure of data, as it conveniently overlooks the enormous investments made by companies. Further, the roles and functions of the various entities under the framework are not clearly defined. The NPDA established under the framework may have functional overlaps with the CCI and the Data Protection Authority.

 

Moreover, there is ambiguity regarding Community NPD and user consent. There is no doubt that the ever-evolving nature of information technology makes heavy demands on any regulatory mechanism, so the road ahead is arduous. Hopefully, the concerns raised will be adequately addressed by the Committee and constructively resolved in favour of all the stakeholders.




Core Legal Issues with Artificial Intelligence in India

The adoption and penetration of Artificial Intelligence in our lives today needs no further enunciation or illustration. While many still consider the technology to be in its infancy, its presence has been so profound that we do not comprehend our reliance on it unless it is specifically pointed out. From Siri and Alexa to Amazon and Netflix, there is hardly any sector that has remained untouched by Artificial Intelligence.

Thus, the adoption of artificial intelligence is not the challenge; its regulation is the slippery slope. This leads us to questions such as: do we need to regulate artificial intelligence at all? If yes, do we need a separate regulatory framework, or are the existing laws enough to govern artificial intelligence technology?

Artificial intelligence goes beyond normal computer programs and technological functions by incorporating the intrinsically human ability to apply knowledge and skills, learning and improving with time. This makes such systems human-like. Since humans have rights and obligations, should these human-like entities not have them too?

But at this point in time, there have been no regulations or adjudications by the Courts acknowledging the legal status of artificial intelligence. Defining the legal status of AI machines would be the first cogent step in the framing of laws governing artificial intelligence and might even help with the application of existing laws.

A pertinent step in the direction of a structured framework was taken by the Ministry of Commerce and Industry when it set up an 18-member Task Force in 2017 to highlight and address the concerns and challenges in the adoption of artificial intelligence and to facilitate the growth of such technology in India. The Task Force came up with a report in March 2018[1] in which it provided recommendations on the steps to be taken in formulating a policy.

The Report identified ten sectors that have the greatest potential to benefit from the adoption of artificial intelligence and can also cater to the development of artificial intelligence-based technologies. The Report also highlighted the major challenges that the implementation of artificial intelligence might face at a large scale, namely: (i) encouraging data collection, archiving and availability with adequate safeguards, possibly via data marketplaces/exchanges; (ii) ensuring data security, protection, privacy and ethical use via regulatory and technological frameworks; (iii) digitization of systems and processes with IoT systems whilst providing adequate protection from cyber-attacks; and (iv) deployment of autonomous products and mitigation of the impact on employment and safety.[2]

The Task Force also suggested setting up an “Inter-Ministerial National Artificial Intelligence Mission” for a period of 5 years, with funding of around INR 1,200 crore, to act as a nodal agency coordinating all AI-related activities in India.

 

Core Legal Issues

When we look at the adoption of artificial intelligence from a legal and regulatory point of view, the main issue to consider is whether the existing laws are sufficient to address the legal issues that might arise, or whether we need a new set of laws to regulate artificial intelligence technologies. While certain aspects, such as intellectual property rights and the use of data to develop artificial intelligence, might be covered under the existing laws, there are some legal issues that might need a new set of regulations to oversee artificial intelligence technology.

 

  • Liability of Artificial Intelligence

 

The current legal regime does not have a framework under which a robot or an artificial intelligence program may be held liable or accountable where a third party suffers damage due to any act or omission by the program. For instance, consider a situation where a self-driving car controlled by an artificial intelligence program gets into an accident. How will the liability be apportioned in such a scenario?

The more complex the artificial intelligence program, the harder it will be to apply simple rules of liability to it. The issue of apportioning liability will also arise where the cause of harm cannot be traced back to any human element, or where an act or omission by the artificial intelligence technology that caused damage could have been avoided by human intervention.

Another instance where the current legal regime may not be able to help is where an artificial intelligence system enters into a contractual obligation after negotiating the terms and conditions of the contract, and a breach of contract subsequently occurs.

In United States v Athlone Indus Inc,[3] the court held that since robots and artificial intelligence programs are not natural or legal persons, they cannot be held liable even where devastating damage is caused. This traditional rule may need reconsideration with the adoption of highly intelligent technology.

The pertinent legal question here is what kind of rules, regulations and laws will govern these situations and who is to decide them, given that artificial intelligence entities are not considered subjects of law.[4]

 

  • Personhood of Artificial Intelligence Entities

 

From a legal point of view, the personhood of an entity is an extremely important factor in assigning rights and obligations. Personhood can be either natural or legal. Attributing personhood is important because it helps identify who would ultimately bear the consequences of an act or omission.

To have any rights or obligations, artificial intelligence entities should be assigned personhood, so as to avoid legal loopholes. “Electronic personhood”[5] could be attributed to such entities in situations where they interact independently with third parties and take autonomous decisions.

 

  • Protection of Privacy and Data

For the development of better artificial intelligence technologies, the free flow of data is crucial, as data is the main fuel on which these technologies run. Thus, artificial intelligence technologies must be developed in a way that complies with the existing laws on privacy, confidentiality and anonymity, and with the other data protection frameworks in place. There must be regulations ensuring that there is no misuse of personal data and no security breach, and there should be mechanisms that enable users to halt the processing of their personal data and to invoke the right to be forgotten.

It further remains to be seen whether the current data protection and security obligations should be imposed on AI and other similar automated decision-making entities to preserve an individual’s right to privacy, which was declared a fundamental right by the Hon’ble Supreme Court in KS Puttaswamy & Anr. v Union of India and Ors[6]. This also calls for an all-inclusive data privacy regime that would apply to both the private and public sectors and would govern the protection of data, including data used in developing artificial intelligence. Similarly, surveillance laws would also need revisiting for circumstances involving the use of fingerprints or facial recognition through artificial intelligence and machine learning technologies.
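One way to picture the mechanisms mentioned above, halting processing and honouring the right to be forgotten, is a data store that filters its records before anything reaches a downstream AI pipeline. This is a minimal sketch under assumed interfaces (the class and method names are invented for illustration), not a construction of any provision of existing law.

```python
# Illustrative sketch only: a store that honours "stop processing"
# objections and erasure requests before data is released for training.
# All names and interfaces here are hypothetical.

class PersonalDataStore:
    def __init__(self):
        self._records = {}        # subject_id -> record
        self._objections = set()  # subjects who withdrew processing consent

    def collect(self, subject_id, record):
        self._records[subject_id] = record

    def stop_processing(self, subject_id):
        """Subject objects to processing: keep the record but exclude it."""
        self._objections.add(subject_id)

    def erase(self, subject_id):
        """Right to be forgotten: delete the record entirely."""
        self._records.pop(subject_id, None)
        self._objections.discard(subject_id)

    def training_view(self):
        """Only records with no outstanding objection are released."""
        return {sid: r for sid, r in self._records.items()
                if sid not in self._objections}

store = PersonalDataStore()
store.collect("u1", {"city": "Pune"})
store.collect("u2", {"city": "Delhi"})
store.stop_processing("u2")   # u2 objects to further processing
store.collect("u3", {"city": "Chennai"})
store.erase("u3")             # u3 invokes the right to be forgotten
```

The design point is that the exclusion happens at the point of release, so an objection or erasure takes effect for every later consumer of the data rather than depending on each consumer remembering to check.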

At this point in time, there are many loose ends to be tied up, such as the rights and responsibilities of the person who controls the data used for developing artificial intelligence, or the rights of the data subjects whose data is used to develop such technologies. The double-edged-sword tension between the development of artificial intelligence and access to data for further additional purposes also needs to be deliberated upon.

Concluding Remarks

In this evolving world of technology with the capabilities of autonomous decision making, it is inevitable that the implementation of such technology will have legal implications. There is a need for a legal definition of artificial intelligence entities in judicial terms to ensure regulatory transparency. While addressing the legal issues, it is important that there is a balance between the protection of rights of individuals and the need to ensure consistent technological growth. Proper regulations would also ensure that broad ethical standards are adhered to. The established legal principles would not only help in the development of the sector but will also ensure that there are proper safeguards in place.



Bulk Data Sharing & Procedure Notification - A Data Breach?

In this digital era, data has become one of the most valuable assets to own. Elections have been won and international alliances have toppled because of the support that could be garnered by utilizing data analytics. While heated debate surrounding data breaches by private entities baffles the world, at home the Indian Government stands accused of monetizing the sale of individuals’ personal data in the pretext of “public purposes” under a notification released by the Ministry of Road Transport and Highways in March 2019 titled “Bulk Data Sharing & Procedure”.

In July 2019, a parliamentary debate pertaining to the “sale of data” by the State was raised because the Government had provided access to databases containing driving license and vehicle registration details to private companies and Government entities and generated revenue out of them. The two databases of the Ministry of Road Transport and Highways under discussion were Vahan and Sarathi. These databases contained details such as vehicle owners’ names, registration details, chassis numbers, engine numbers, and driving license particulars of individuals. These details amount to personal information by which an individual could be identified (“Personal Data”).

The sale of data was pursuant to a notification released by the Ministry of Road Transport and Highways in March 2019 titled “Bulk Data Sharing & Procedure”, wherein a policy framework on the sale of bulk data relating to driving licenses and vehicle registrations was introduced. Among other things, this write-up discusses whether such sale of Personal Data for revenue generation is acceptable in light of privacy being a fundamental right and the Data Protection Bill, 2018, and whether such access constitutes a data breach.

 

Bulk Data Sharing & Procedure Notification 

The “Bulk Data Sharing & Procedure” notification by the Ministry of Road Transport and Highways states the purpose for which bulk data access would be  provided: 

“…it is recognized that sharing this data for other purposes, in a controlled manner, can support the transport and automobile industry. The sharing of data will also help in service improvements and wider benefits to citizens & Government. In addition, it will also benefit the country’s economy”.

As per the notification, only entities that meet the eligibility criteria would be provided access to bulk data. The eligibility criteria are that the entity should be registered in India with at least 50% Indian ownership, the bulk data should be processed and stored in servers/data centres in India, and the entity should have obtained a security pre-audit report from a CERT-In empanelled auditor. The bulk data access would be provided for a price.

Commercial organizations could obtain such data for INR 3 crore and educational institutions for INR 5 lakh. As per the notification, the bulk data will be provided in encrypted form with restricted access, and the recipient entities would be restricted from any activity that identifies individuals using such data sets. The entities would be required to follow certain protocols for data loss prevention, access controls, audit logs, security and vulnerability. Violation of these protocols is punishable under the Information Technology Act, 2000.
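The notification's safeguards are policy obligations, but the "partial masking" idea discussed later in this write-up can be sketched in a few lines. The field names and the masking rule below are assumptions chosen for illustration, not taken from the notification itself.

```python
# Illustrative sketch only: partially masking identifying fields in a
# bulk record before sharing, in the spirit of the restriction on
# identifying individuals. Field names and the masking rule are
# hypothetical, not taken from the notification.

def mask(value, visible=4):
    """Keep only the last `visible` characters; mask the rest."""
    s = str(value)
    return "*" * max(len(s) - visible, 0) + s[-visible:]

def prepare_bulk_record(record, fields_to_mask):
    """Return a copy of the record with the listed fields masked."""
    out = dict(record)
    for field in fields_to_mask:
        if field in out:
            out[field] = mask(out[field])
    return out

record = {
    "registration_no": "MH12AB3456",
    "chassis_no": "MA3FHEB1S00612345",
    "vehicle_class": "LMV",
}
shared = prepare_bulk_record(record, ["registration_no", "chassis_no"])
```

Masking of this kind reduces, but does not eliminate, the risk of identification: the unmasked tail of a registration number combined with other fields can still narrow a record down to one person, which is the tension the rest of this piece examines.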

The Ministry of Road Transport and Highways has, in accordance with this policy framework, provided database access to 87 private companies and 32 government entities for a price of INR 65 crore, resulting in the Personal Data of all individuals being accessible to them. The Data Principal (the individual whose information is in the database) has no knowledge of, or control over, any use or misuse of his or her information.

In any data protection framework worldwide, the Data Principal’s consent should be sought, stating the purpose for which the data is to be used. It is only pursuant to the Data Principal’s consent that any information can be processed. On the contrary, providing third-party private companies access to Personal Data without the consent of the Data Principal keeps that data outside the Data Principal’s effective control. This is against the basic principles of data protection.

 

Proposed Legislation for Data Protection 

India is on the verge of a new Data Protection Act, as the bill is being placed before Parliament. The Data Protection Bill, 2018 contains certain provisions to address the above-mentioned issues. Section 5 of the Data Protection Bill states when personal data can be processed: personal data shall be processed only for purposes that are clear, specific, and lawful. Section 5 is extracted below:

  1. Purpose limitation— (1) Personal data shall be processed only for purposes that are clear, specific and lawful. (2) Personal data shall be processed only for purposes specified or for any other incidental purpose that the data principal would reasonably expect the personal data to be used for, having regard to the specified purposes, and the context and circumstances in which the personal data was collected.

Moreover, the relevant enactment regulating driving licenses and vehicle registration, i.e. the Motor Vehicles Act, does not explicitly permit the State to sell, or provide third parties access to, Personal Data for the generation of revenue. Therefore, there is no clear, specific, or lawful basis for such access in the enactment. The question arises whether access to bulk Personal Data can be interpreted as an “incidental purpose” that the “data principal would reasonably expect”. The data principal provided this information only for the purpose of the grant of a motor vehicle license and vehicle registration; the Data Principal ought not to have expected his or her data to be sold by the Government.

Section 13 of the Data Protection Bill is also of relevance here because it authorizes the State to process Personal Data for provision of services, benefit or issuance of certification, licenses or permits.  Section 13 is extracted below: 

Section 13 – Processing of personal data for functions of the State. — Personal data may be processed if such processing is necessary for the exercise of the functions of the State authorised by law for: (a) the provision of any service or benefit to the data principal from the State; (b) the issuance of any certification, license, or permit for any action or activity of the data principal by the State.

 

By this section, the State is authorized to use Personal Data for grant of license or permits or to provide any benefit or service.  However, whether the State is authorized to give access to Personal Data to third party private companies is unclear. 

Section 17 of the Data Protection Bill tries to shed some light on this anomaly.  The section states that Personal Data may be processed for “reasonable purposes” after considering if there is any public interest involved in processing the same.  What constitutes reasonable purpose is yet to be specified by the Data Protection Authority to be constituted.  Section 17 is extracted hereunder: 

  1. Processing of data for reasonable purposes. — 

(1) In addition to the grounds for processing contained in section12 to section 16, personal data may be processed if such processing is necessary for such reasonable purposes as may be specified after taking into consideration— 

(a) the interest of the data fiduciary in processing for that purpose; 

(b) whether the data fiduciary can reasonably be expected to obtain the consent of the data principal; 

(c) any public interest in processing for that purpose; 

(d) the effect of the processing activity on the rights of the data principal; and 

(e) the reasonable expectations of the data principal having regard to the context of the processing. 

(2) For the purpose of sub-section (1), the Authority may specify reasonable purposes related to the following activities, including— 

(a) prevention and detection of any unlawful activity including fraud; 

(b) whistle blowing; 

(c) mergers and acquisitions; 

(d) network and information security; 

(e) credit scoring; 

(f) recovery of debt; 

(g) processing of publicly available personal data; 

(3) Where the Authority specifies a reasonable purpose under sub-section (1), it shall: (a) lay down such safeguards as may be appropriate to ensure the protection of the rights of data principals; and (b) determine where the provision of notice under section 8 would not apply having regard to whether such provision would substantially prejudice the relevant reasonable purpose. 

 

Section 17, therefore, clarifies that where public interest is involved, the State may provide third parties access to publicly available personal data. Read with Section 13, this indicates that the State is not required to obtain the consent of the Data Principal in order to provide services and benefits.

 

Has the State provided access to personal data in the public interest or to provide services and benefits?

The Bulk Data Sharing & Procedure notification states that the purpose of providing access to bulk Personal Data is to “support the transport and automobile industry” and “help in service improvements and wider benefits to citizens & Government”. Supporting the transport and automobile industry and improving services may qualify as public interest, whereas mere revenue generation will not. However, there is no clarification from the Government as to how the private companies being given database access serve the public interest. Further, whether all driving license and registration data can be classified as publicly available information is again contentious and questionable, as such information is intended to be provided only to license holders and vehicle owners, and is partially masked.

If this Personal Data is not construed as public data, or if these private companies have been given access to Personal Data in the absence of any public interest, it would amount to a personal data breach by the Government departments concerned, for which the head of the department would be held liable under Section 96 of the Data Protection Bill.

It is quite preposterous to note that, on the one hand, the Data Protection Bill is being tabled in Parliament, while on the other, the Government is selling the Personal Data of the general public for economic gain. Whether this amounts to the exploitation of personal and private data on the pretext of public interest, without individuals’ consent, needs to be ascertained.


 

