The November 2019 report of the Internet and Mobile Association of India suggests that India has 504 million internet users. According to the report, around 227 million of them are rural Indians, and about 205 million are from the urban areas of the country. The research also pointed out that there are 21 percent more women online now, in addition to about 71 million children aged between 5 and 12.
All of these data points can help marketers and digital agencies derive various trends, but none is as important as the fact that the Internet can be a vulnerable place, especially for women and our little ones in this country. The recent ‘bois locker room’ incident is a testament to that.
The industry has failed to read between the lines of these data points and understand that tech platforms need to be responsible and build moderation capabilities, especially in a country like India where certain segments of our society remain vulnerable.
With the majority of digital traffic originating from rural India, our digital behaviour is not the most evolved. We are still young in the digital space, which leads to the generation of content that is sensitive in nature and needs strict moderation. Both users and brands have been compromised in the past, and the evidence is there for anyone to Google.
Ad-tech platforms, digital partners, and agencies need to equip themselves with skills and measures that will bring more accountability. While a few businesses have incorporated ad-fraud solutions that counter fake news and fraudulent advertising, such solutions should form part of broader policies that include brand and user safety measures and more actively blacklist offending platforms.
More and more brands are now using digital video content to advertise, educate, and entertain. Coupled with the rise of short-video content and other ad-tech platforms that pivot largely on user-generated content, brands risk compromising their safety for the large audiences that some of these platforms have to offer.
Users are at even greater risk because there are no policies or guidelines that direct or guide a creator towards deriving engagement from meaningful content that does not hurt sentiments. The constant barrage of a monoculture that sometimes objectifies women needs to be curbed.
Whether it is ad-tech platforms, advertisers, or users, everyone must be collectively responsible for the kind of content that is out there. Users, especially in a country like India, must be guided on what may or may not be harmful, considering that children and housewives will be the new Internet adopters in the next year or two.
Further, content has been a key driver, with nine out of 10 users accessing the Internet daily for entertainment and communication. And with the coronavirus pandemic putting the majority of India on a forced vacay, leaving users enough time to consume more, there is an urgent need for significant moderation of content.
While India remains a hot destination for brands to launch and flourish, it is also, in many ways, far more challenging, given the sensitivities that need to be considered. Be it misleading political commentary, sexist perspectives, crime, animal cruelty, or child pornography, none of this can be served to the young audiences that form the majority of Internet users. This also puts at risk the millions that advertisers set aside for their brand campaigns.
With digital advertising growing at more than 30 percent year on year, keeping a sharp eye on objectionable content and building the technology to filter it will only help create a sustainable growth environment.
In a world of evolving content, the only way to understand true context is to dive deep and analyse the content on platforms. While the focus must remain on optimising marketing efforts, a sharp eye on content, even at the page level, will be critical to protecting a brand's reputation and its users' interests. No matter what the brand guidelines are, reputation is likely to suffer from blind spots unless the brand has its ad-tech partners' commitment to offer an environment free from unsafe content.
While several players are using Machine Learning and Artificial Intelligence, in a country with such diverse sensitivities, human moderation must act as the final checkpoint for all content. Safety for brands as well as users needs to be a strong guarantee in India, not a long-drawn process. Considering that we live in an extremely tech-savvy environment, safety for all in the ecosystem should not even be a conversation; it should be a given.
There is a need to create an alliance among India's leading brands, ad-tech platforms, agencies, and trade bodies to stop the spread of harmful content, similar to what Unilever and a few other brands have done in the past with the objective of forming strong policies. It is important that we all, as part of this fast-growing digital environment, make collaborative efforts to put the brakes on harmful content and also help our users develop their digital behaviour through responsible communication.
The digital ecosystem will continue to see rapid growth, perhaps even more so with this pandemic. It is now up to those of us in the business to join hands, integrate, and build an environment that pollutes less and delivers more value. The time to act is now, and it cannot wait for Government intervention.
Sunil Nair is CEO, Firework India, a distributed short-video content network focussed on delivering professionally produced, user-generated, brand-safe content.
Disclaimer: The opinions expressed within this article are the personal opinions of the author. NDTV is not responsible for the accuracy, completeness, suitability, or validity of any information on this article. All information is provided on an as-is basis. The information, facts or opinions appearing in the article do not reflect the views of NDTV and NDTV does not assume any responsibility or liability for the same.