Regulating Digital Media Platforms & Free Speech Rights

In a recent political battle on Twitter, Mr. Rahul Gandhi tweeted that "BJP & RSS control Facebook & WhatsApp in India. They spread fake news and hatred through it and use it to influence the electorate." The BJP countered this view, alleging that employees of Facebook are on record abusing Prime Minister Narendra Modi and senior cabinet ministers. Mr. Shashi Tharoor, who heads the Parliamentary Standing Committee on Information Technology, summoned Facebook to explain these reports and what it proposes to do about hate speech in India.


All this highlights the power and responsibility bestowed upon social media platforms. One has to agree with the concerns aired by these leaders: digital platforms are a venue for abuse and have become a podium for copyright infringement, defamation, hate speech, violence, pornography, terrorism, anti-national activities and related offences. Regulating these harmful and illegal activities is essential for law and order, peaceful coexistence and upholding constitutional values. From this perspective, it is pertinent to analyse the law applicable to the conduct of these digital media platforms.


During the last decade, the entire digital ecosystem has become increasingly platform-centric. There are specialised platforms for each kind of goods and services: platforms for vehicle hire, real estate, music and video sharing, news and social media, etc. Most of them claim to be only technology providers, not actual service providers or sellers. This positioning was probably adopted to secure the benefit of Section 79 of the Information Technology Act, 2000, which exempts intermediaries from liability.


If the platforms are mere enablers for others to do their business, then they can very well invoke this provision and escape all liability for the transactions or business carried out through the platform. Provisions similar to Section 79 exist in most other jurisdictions and are called safe-harbour provisions (e.g., Section 230 of the Communications Decency Act in the USA). However, when these enactments were made around the beginning of the new millennium, the nature of the internet and the activities conducted over it were very limited. The exemption was primarily for the benefit of telecom service providers and data carriers. In those days, telecom companies were mere blind carriers of internet data, simply facilitating data transfer from one computer to another, and exempting them from liability was therefore logical. The same was true of data storage providers and other intermediaries at that point in time. Jurists all over the world felt it appropriate not to impose any liability on these internet facilitators.


But over the years, digital technology has evolved: search engines, browsers, digital platforms, etc. have become integral internet facilitators. Their character has graduated from mere facilitators to more intelligent service or sales enablers, and on some occasions they directly participate in rendering services or products. Today, these platforms are the backbone of internet infrastructure and occupy a crucial position in the scheme of things. Hence, the benefit of Section 79 granted to these platforms needs to be strictly limited. As per Section 79(2)(c), the exemption is available only if the intermediary observes due diligence while discharging its duties under the law and also observes such other guidelines as the Central Government has prescribed. It is also mandated that upon receiving actual knowledge, or on being notified by the appropriate Government, that any information or data on the intermediary's platform is being used to commit an unlawful act, the intermediary should expeditiously remove or disable access to that material without vitiating the evidence in any manner.


A decade later, in 2011, the Information Technology (Intermediaries Guidelines) Rules were notified. As per these Rules, every intermediary platform should have a user agreement, and that agreement should restrict users from hosting, displaying, uploading, modifying, publishing, transmitting, updating or sharing any information that:

a) belongs to another person and to which the user does not have any right;

b) is grossly harmful, harassing, blasphemous, defamatory, obscene, pornographic, paedophilic, libellous, invasive of another's privacy, hateful, or racially or ethnically objectionable, disparaging, relating to or encouraging money laundering or gambling, or otherwise unlawful in any manner whatsoever;

c) harms minors in any way;

d) infringes any patent, trademark, copyright or other proprietary rights;

e) violates any law for the time being in force;

f) deceives or misleads the addressee about the origin of such messages or communicates any information which is grossly offensive or menacing in nature;

g) impersonates another person;

h) contains software viruses or any other computer code, files or programs designed to interrupt, destroy or limit the functionality of any computer resource; and

i) threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign states, or public order, or causes incitement to the commission of any cognisable offence, or prevents investigation of any offence, or is insulting to any other nation.

In 2017, in order to prevent the sharing of content related to child sexual abuse, ISPs were directed to dynamically block and remove content that was paedophilic or harmful to minors, using the Internet Watch Foundation list.
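The mechanics of such list-based blocking can be illustrated with a small sketch. This is purely illustrative, assuming a hashed-URL blocklist: the URLs, hashing scheme and function names below are invented for demonstration and do not reflect the actual format of the IWF feed, which is distributed to members under licence.

```python
# Illustrative sketch only: how an ISP-side filter might consult a URL
# blocklist of the kind the Internet Watch Foundation maintains. All URLs
# and the storage format here are assumptions for demonstration.

import hashlib

# Hypothetical blocklist, stored as SHA-256 digests of the listed URLs so
# the plaintext list itself need not be kept on the filtering box.
BLOCKLIST = {
    hashlib.sha256(u.encode("utf-8")).hexdigest()
    for u in ("http://bad.example/abuse1", "http://bad.example/abuse2")
}

def is_blocked(url: str) -> bool:
    """Return True if the requested URL appears on the blocklist."""
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()
    return digest in BLOCKLIST

# An ISP proxy would run this check before serving each request:
print(is_blocked("http://bad.example/abuse1"))  # True  -> block the request
print(is_blocked("http://good.example/news"))   # False -> serve normally
```

Because the list is consulted per request, updates to it take effect "dynamically", which is what the 2017 direction contemplated.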


Parliament has understood this requirement in recent years and has started making special provisions imposing certain obligations on these internet platforms. For example, the Legal Metrology (Packaged Commodities) Rules mandate certain obligations on e-commerce players regarding the content to be displayed for each item. Similarly, under the Consumer Protection Act, 2019, separate e-commerce rules were made imposing obligations and duties on e-commerce players, including what information relating to the product or service, the seller and the manufacturer is to be displayed and how, the need to establish a grievance redressal mechanism, and the appointment of a nodal officer to ensure compliance and to redress consumer complaints.


However, in the case of digital media and communication platforms, the government suggested some amendments to the intermediary rules in 2018 and called for public comments. As per the proposed amendment, an intermediary, upon receiving actual knowledge in the form of a court order, or on being notified by the appropriate Government or its agency under Section 79(3)(b) of the Act, was required to remove or disable access on its computer resource to unlawful content relatable to the grounds in Article 19(2) of the Constitution of India (namely, the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, public order, decency or morality, or in relation to contempt of court, defamation or incitement to an offence), without vitiating the evidence in any manner, as far as possible immediately, but in no case later than twenty-four hours.


It was also proposed that intermediaries with more than fifty lakh users in India, or those in a list of intermediaries specifically notified by the Government of India, shall: (i) be a company incorporated under the Companies Act, 1956 or the Companies Act, 2013; (ii) have a permanent registered office in India with a physical address; and (iii) appoint in India a nodal person of contact and an alternate senior designated functionary for 24×7 coordination with law enforcement agencies and officers, to ensure compliance with their orders and requisitions made in accordance with the provisions of law or rules. However, this amendment is yet to be notified.


Social media platforms have a huge role in present-day society. They act as a communication channel, a meeting place, a chat room, a file-sharing site and a news-disseminating avenue. A platform is both a publishing house and a distributor of news items. It also acts as a meeting point, much like the teashop or pub gatherings and conversations at public squares of the previous century. Since it works as a communication medium, it is in a way replacing the telephone and postal services. In addition, goods are displayed and targeted advertisements are shown on these platforms. Many social media platforms are integrated with digital payment mechanisms through which users can transact money. In short, the role and business of social media extend into many strictly regulated segments: media and publications, telecom, postage, finance and banking.


If someone misbehaves, creates a nuisance or indulges in undesirable behaviour on social media, the repercussions are far greater, as the content reaches many people and circulates massively. Hence regulating this outlet is essential.


No one is allowed to operate telecom services without approval from the concerned department. There are strict rules for the publication of books, periodicals and newspapers. Similarly, the release of a cinematographic film is regulated. All these activities, when done in a digital medium, should therefore be regulated in a similar manner as in traditional media, or more stringently. Since the users of digital platforms are not organised bodies, regulating them en masse is difficult to implement and would cause unnecessary discomfort to the public at large. The difficulty of tracing users and the ease of anonymising user identity make things more complex. Naturally, the best solution is to regulate the platform that enables the public to carry out these activities.


India ensures its citizens free speech and expression rights under Article 19 of the Constitution, which can be abridged only on the select grounds listed in Article 19(2). The right to hold a political opinion is paramount. The majority of the comments received on the amendment proposal indicate that the popular view is against giving the government the freedom to dictate to platforms and demand that content be struck down. I am uncertain in what form the final amendment will be made. At the same time, I am sure there will be a unanimous view that restricting the sharing of political comments on social media would be disastrous for the country and its democracy. Political views of all parties should be freely expressed on social platforms, because this helps in creating public opinion. Neither the existing intermediary rules nor the proposed amendment attempts to interfere with users' political comments, and this is a step in the right direction.


Undoubtedly, social media platforms are so powerful in every respect that the revenues of some of them are bigger than the annual budgets of many developed countries. These platforms have become super-sovereigns beyond the control of any one government. This excessive control and dominance have made them realise their responsibilities, and they have voluntarily developed their own ethics and controls over the activities happening on their platforms. Many of these controls are not based on any legal requirement, and this self-regulation has brought some order to the activities on these platforms. All the present digital media platforms have created a content moderation system, named differently in each case: Community Standards for Facebook, Community Guidelines for YouTube, Rules and Policies for Twitter, Professional Community Policies for LinkedIn, etc.


These platforms also let users raise concerns about content and request its takedown if it is offensive. They have appointed large numbers of moderators to address these complaints, deployed artificial intelligence for early detection of harmful activities on their platforms, and created an appeal mechanism for when moderators' decisions are unsatisfactory. Some platforms have also made rules clarifying what is and is not allowed. For instance, platforms like Facebook, Twitter and YouTube have restricted and disallowed posts and content related to violence (such as terrorism and organised hate), criminal activities (such as mass murder), sensitive and objectionable content (such as hatred towards a community, caste, religion or protected or indigenous people, cyberbullying, child abuse, etc.), content that encourages suicide or self-injury, misrepresentation, misleading content, spam and false information. If a user violates any of these content moderation rules, the platform will remove the content and, in severe cases, may report the user to law enforcement agencies. These self-regulatory efforts are made voluntarily by the companies and deserve appreciation.
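The workflow described above, automated screening followed by human review and escalation of severe cases, can be sketched minimally. The thresholds, labels and fields below are illustrative assumptions, not any platform's actual policy engine.

```python
# A minimal sketch of a moderation workflow: an AI classifier assigns a risk
# score, borderline posts go to human moderators, and severe violations are
# removed (and possibly reported to law enforcement). All thresholds and
# labels are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    ai_risk_score: float  # 0.0 (benign) .. 1.0 (clearly violating), from a classifier

def triage(post: Post) -> str:
    """Route a post based on an automated risk score (thresholds are assumed)."""
    if post.ai_risk_score >= 0.95:
        return "remove_and_report"   # severe: remove; may be reported to authorities
    if post.ai_risk_score >= 0.60:
        return "human_review"        # uncertain: queue for a human moderator
    return "allow"                   # benign: publish; user complaints can still flag it

print(triage(Post("holiday photos", 0.05)))       # allow
print(triage(Post("borderline content", 0.70)))   # human_review
print(triage(Post("clearly violating", 0.99)))    # remove_and_report
```

The appeal mechanism the platforms provide would sit after the `human_review` and `remove_and_report` outcomes, re-running the decision with a different reviewer.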


The proposed amendments to the intermediary rules also do not bring much clarity on how these platforms are expected to behave, what content is to be allowed, and what requires regulation and restriction, as they leave enough room for subjectivity. In the absence of clear legislative prescriptions, these companies consider themselves free to devise their own processes, guided by their own logic and morals. Legislators in other countries have already initiated and suggested regulations for digital platforms: for example, the European Commission Recommendation of 1.3.2018 on measures to effectively tackle illegal content online, and similarly in the UK a proposed "duty of care" regime imposing certain duties on digital platforms. Drawing from these legislative initiatives and keeping domestic requirements in mind, the intermediary rules should be amended expeditiously. The full onus for non-user-generated content should rest on the social media platforms, as they are directly responsible for that content. For user-generated content, however, the amendment should not impose any restriction on free speech rights beyond those mentioned in Article 19(2) of the Constitution. In no instance should political speech, comments, criticisms and views be curtailed.


Social media platforms should not be bestowed with umpiring power to judge the appropriateness of political views and expressions. If followers of one political party or one ideology make more posts than their opponents, social media should not be blamed for it; rather, people with opposing views should also be active on social media to counter them and establish alternative viewpoints.


Social media should remain a platform for free political discussion and debate. The term "hate speech" should be narrowly construed to mean only those aspects permitted to be restricted under Article 19(2) of the Constitution. With regard to "fake news", a duty of care could be imposed on platforms to maintain appropriate AI systems to curb its spread. However, it should be kept in mind that detecting fake news is technically challenging, and strict liability on a platform for failing to detect it would be impractical. Without creating balanced regulation for them to follow, simply accusing social media would be inappropriate.
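Why detection is technically challenging can be shown with a deliberately naive sketch. The phrases and examples below are entirely invented; real systems use far more sophisticated models, yet they face the same two failure modes this toy flagger exhibits.

```python
# A deliberately naive fake-news flagger, to illustrate why strict liability
# for detection failures is impractical: simple heuristics both miss reworded
# falsehoods and wrongly flag legitimate reporting. All phrases are invented.

SUSPECT_PHRASES = ("miracle cure", "doctors hate", "100% guaranteed")

def looks_fake(text: str) -> bool:
    """Flag text containing any suspect phrase (a crude heuristic, not real detection)."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in SUSPECT_PHRASES)

# A reworded falsehood slips through (false negative):
print(looks_fake("This herb cures every disease, experts agree"))  # False
# A news report merely quoting the claim gets flagged (false positive):
print(looks_fake("Regulator bans ad claiming a miracle cure"))     # True
```

Any threshold a regulation sets would trade one failure mode against the other, which is why a duty of care on process, rather than strict liability on outcome, is the more workable standard.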