An Australian programmer named Hoan Ton-That created an app that allowed people to put Donald Trump's hair into their photos. He subsequently went further and built a facial recognition app, which he offered to various authorities, including police departments in Florida and the FBI.
The app, which belongs to his small company Clearview AI, lets a user take a photo of a person and compare it against a gigantic database, returning all the other photos that match the facial features of the original.
According to the New York Times, the app has already helped authorities solve several cases of theft, identity and financial fraud, and even child abuse.
Even so, the ethical question remains: to what extent should authorities have access to this type of tool? It echoes the ongoing dispute between Apple and the FBI, which continues to insist that Apple create a "passkey" to unlock criminals' iPhones.
We live at a time when privacy is something we largely give up the moment we go online. However, that does not mean it may be invaded by just anyone. The growth of deepfakes and the creation of fake content is one of the risks of A.I.
Sundar Pichai wants regulation in artificial intelligence
Google CEO Sundar Pichai published an opinion piece in the Financial Times warning about the negative effects of artificial intelligence on society. Pichai argues that companies must be careful when developing such technologies.
Sundar Pichai adds that A.I. cannot be regulated by the companies themselves, given their financial interests. The CEO says it is up to governments to create regulations so that A.I. is not used for harmful purposes, or at least so that such use is not encouraged.
European Union also wants to regulate the development of A.I.
It was recently reported that the European Union wants to create a plan to ensure that artificial intelligence technologies are applied ethically. The plan may limit access to artificial intelligence by public and private companies.