Why in news?
The central government has recently released the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
What is the objective?
- The guidelines aim to regulate social media, digital news media, and over-the-top (OTT) content providers.
- They were released following directions from the Supreme Court and concerns raised in Parliament over the misuse of social media.
- The government wanted to create a level playing field in regulating online news and media platforms vis-à-vis traditional media outlets.
- The Rules also seek to -
- empower the ordinary users of digital platforms to seek redressal for their grievances
- ensure accountability in case of infringement of users’ rights
- The guidelines related to social media will be administered by the Ministry of Electronics and IT.
- The Digital Media Ethics Code relating to Digital Media and OTT Platforms will be administered by the Ministry of Information and Broadcasting.
Why now?
- The government had been working on these guidelines for over 3 years.
- The immediate push came in the form of the violent incidents at the Red Fort on January 26, 2021.
- Following this, the government and Twitter had disagreements over the removal of certain accounts from the social media platform.
What are the key provisions related to social media?
- Social Media Intermediaries - Social media intermediaries are platforms that host user-generated content.
- E.g. Twitter, Facebook, YouTube, WhatsApp
- The Rules create two categories of social media intermediaries:
- social media intermediaries
- significant social media intermediaries
- This is to encourage innovation and enable the growth of new social media intermediaries without subjecting smaller platforms to significant compliance requirements.
- The distinction is based on the number of users on the social media platform.
- The government is empowered to notify the user-base threshold for these categories.
- The Rules require the ‘significant social media intermediaries’ to follow certain additional due diligence.
- Due diligence - Section 79 of the IT Act provides a “safe harbour” to social media intermediaries.
- It exempts them from liability for the actions of users if they adhere to government-prescribed guidelines.
- The new guidelines prescribe an element of due diligence to be followed by the intermediary.
- Failure to do so would mean that the safe harbour provisions cease to apply to them.
- Grievance redressal - The Rules mandate that intermediaries, including social media platforms, establish a mechanism for receiving and resolving complaints from users.
- These platforms will need to appoint a grievance officer to deal with such complaints.
- The officer must acknowledge the complaint within 24 hours, and resolve it within 15 days of receipt.
- In addition to a grievance officer, significant social media intermediaries will have to appoint a chief compliance officer resident in India.
- The chief compliance officer will be responsible for ensuring compliance with the rules.
- The platforms will also be required to appoint a nodal contact person for 24×7 coordination with law enforcement agencies.
- Further, the platforms will need to publish a monthly compliance report.
- This should have details of -
- complaints received and action taken on the complaints
- contents removed proactively by the significant social media intermediary
- The due diligence requirements will come into effect 3 months after the notification of the Rules.
- Removal of content - The Rules lay down 10 categories of content that social media platforms should not host.
- These include content that –
- threatens the unity, integrity, defence, security or sovereignty of India
- threatens friendly relations with foreign States, or public order
- causes incitement to the commission of any cognizable offence
- prevents investigation of any offence
- insults any foreign State
- is defamatory, obscene, pornographic, paedophilic, invasive of another’s privacy, including bodily privacy
- insults or harasses on the basis of gender
- is libellous, racially or ethnically objectionable
- is relating to or encouraging money laundering or gambling
- is otherwise inconsistent with or contrary to the laws of India, etc
- A court or the appropriate government agency may inform the platform that it is hosting prohibited content.
- Upon receipt of such information, the platform should remove the said content within 36 hours.
- Penalties for violation - If an intermediary fails to observe the Rules, it would lose the safe harbour protection and be liable for punishment.
- This will be “under any law for the time being in force including the provisions of the IT Act and the Indian Penal Code”.
- The offences under the IT Act include, among others, -
- tampering with documents
- hacking into computer systems
- online misrepresentation
- breach of confidentiality and privacy
- publication of content for fraudulent purposes
- The penal provisions range from imprisonment for 3 years to a maximum of 7 years, with fines starting from Rs 2 lakh.
What are the key provisions on Digital Media and OTT Platforms?
- The Digital Media Ethics Code prescribes the guidelines to be followed by OTT platforms and online news and digital media entities.
- OTT services - For OTT service providers, the government has prescribed self-classification of content into five categories based on age suitability.
- U - Online curated content that is suitable for children and for people of all ages
- U/A 7+ - Content that is suitable for persons aged 7 years and above, and which can be viewed by a person under the age of 7 years with parental guidance
- U/A 13+ - Content that is suitable for persons aged 13 years and above, and can be viewed by a person under the age of 13 years with parental guidance
- U/A 16+ - Content which is suitable for persons aged 16 years and above, and can be viewed by a person under the age of 16 years with parental guidance
- A - Online curated content which is restricted to adults
- Platforms would be required to implement parental locks for content classified as U/A 13+ or higher.
- Reliable age verification mechanisms are also required for content that is classified as “A”.
- News platforms - The publishers of news on digital media would be required to observe -
- Norms of Journalistic Conduct of the Press Council of India
- the Programme Code under the Cable Television Networks (Regulation) Act
- The Rules thereby provide a level playing field between the offline (Print, TV) and digital media.
- Grievance redressal mechanism - A three-level grievance redressal mechanism has been established with different levels of self-regulation.
- Level-I: Self-regulation by the publishers
- Level-II: Self-regulation by the self-regulating bodies of the publishers
- Level-III: Oversight mechanism
- Self-regulation by the Publisher: The publisher shall appoint a Grievance Redressal Officer based in India.
- The officer shall take a decision on every grievance received within 15 days.
- Self-Regulatory Body: There may be one or more self-regulatory bodies of publishers.
- Such a body shall be headed by a retired judge of the Supreme Court/High Court or an independent eminent person.
- It shall have not more than six members.
- Such a body will have to register with the Ministry of Information and Broadcasting.
- This body will oversee the adherence by the publisher to the Code of Ethics.
- It will also address grievances that have not been resolved by the publisher within 15 days.
- Oversight Mechanism: The Ministry of Information and Broadcasting shall formulate an oversight mechanism.
- It shall publish a charter for self-regulating bodies, including Codes of Practice.
- It shall also establish an Inter-Departmental Committee for hearing grievances.
Source: The Indian Express, PIB