Clock stops ticking for TikTok; time for 3rd party apps to take responsibility?
Video hosting and sharing platform TikTok has been banned from the Google Play Store and Apple App Store following an order by the Madras High Court on April 3, 2019. Several political pressure groups and conservative parties had brought to the court's attention that TikTok was encouraging pornographic content and violence, and making minors vulnerable to sexual predators.
Apple and Google have complied with the High Court's order to block downloads of TikTok on their platforms.
Last year, TikTok announced that it had reached 500 million monthly active users on its platform. Since then, the app has surged both in popularity and in controversy. This is not the first time it has been called out for hosting inappropriate content. In China, Tencent barred WeChat users from sharing external links to short-video content, particularly from TikTok, flagging such links as inappropriate. Pakistan and Indonesia have banned TikTok outright, citing the app as a bad influence.
Is the platform at fault here for allowing content sharing to run unchecked?
Ahmed Aftab Naqvi, CEO & Co-Founder, Gozoop said, “Technology is never the problem, but how technology is used can be good or bad. Platforms definitely need to take measures to control offensive/vulgar content. I see this impacting brand campaigns on TikTok in the short term. We were also planning TikTok campaigns for our clients at Gozoop, but right now we will be re-evaluating them. TikTok is a fantastic platform for engagement with a highly engaged audience. We are expecting campaigns to restart as soon as the platform abides by the law.”
To be fair, TikTok is not the first application to come under the scanner for sharing harmful content. WhatsApp has been feeling the heat from the government for quite some time over the fake news that spreads unchecked on its network.
According to Heena Tickoo, Director Client Servicing India, DCMN, “It has been a long debate and social media platforms need to take responsibility for content that is hosted on their platforms. Tech companies are big enough and make enough money that they should be held accountable for the information they carry. They must be liable for the actions that they allow other people to do and for the profits that one makes off such actions. TikTok has been facing regulatory issues in many countries now and I’m sure a complete ban in India would raise red flags among all other players too, who would start taking content regulation more seriously.”
What is remarkable is the government’s swift response to the issue. What implication does this hold for other third-party apps that skirt responsibility for hosting harmful content?
Ashish Patkar, Founder and CEO, Monk Media Network, said, “The implications are clear: it's your platform, and you're accountable for every piece of content that goes up there. The moment your content starts causing pain and trouble to people, regardless of whether it's of their own doing, you're going down.”
Tickoo opines, “It doesn’t look like it will affect current users for now, since existing users can still access the app. Hopefully, it might discourage content that promotes pornography or the hypersexuality of kids, at least in the interim.”
Patkar added, “While there have been so many reported incidents while using the app, and the decision has been taken to remove the app across operating systems, how will it really help? The app provided a platform that let anyone be a creator and create content. While some tried to go beyond the regular and do stuff to gather more followers and likes, and ended up badly injured or even lost their lives, you can't really blame the app for it, can you? Of the 120 million users already on the app, people who have gained a significant number of followers would surely look for different outlets to push their content and continue to build a larger fan base.”