Facebook and Google will soon have to contend with a new UK law. Companies will be fined if they fail to quickly remove certain harmful content from their platforms.
The law covers content that encourages terrorism, child abuse and exploitation. It is yet another example of governments cracking down on technology companies, something we have seen repeatedly in recent months.
The UK government argues that Facebook is not simply a content distributor but a 'media empire'. Under the law, corporate directors would be held liable if content is not removed within a specified time.
The UK is putting pressure on Google and Facebook
These measures are also intended to combat misinformation, so-called fake news and possible election interference. The approval of Articles 13 and 11 of the EU Copyright Directive will certainly reinforce them.
Reports indicate that this UK pressure on online content was prompted by the death of Molly Russell, a 14-year-old who took her own life in 2017 after allegedly viewing suicide-related content on the internet.
The terrorist attack in New Zealand, and its live streaming, also motivated the UK to create the new law. Apparently, the UK government does not believe companies are doing enough to protect their users.
The new proposal aims to protect UK citizens from potentially harmful content. It also aims to ensure that companies do not shirk their responsibilities.
Facebook will not be the only one affected. Search engines such as Google and online messaging services will also fall under the new law.
In conclusion, we can see this law as a filtering tool for harmful content. When it comes to children, however, that responsibility arguably lies with parents, not the government.