In a metaverse where everyone lives on the web, privacy and security are crucial. Shortly before rebranding as “Meta,” Zuckerberg’s Facebook began using a rather novel trick: asking users to upload their own nude photos.
This is not as alarming as it sounds; the point is to use an algorithm to prevent the non-consensual disclosure of users’ nude photos. The core principle is to compute a unique “hash” of the photo and store it for matching, so that if anyone uploads the same image later, it can be identified and blocked.
The so-called hash value is produced by a hashing algorithm – PDQ for photos, MD5 for videos – that maps a large piece of data to a much shorter one. This short value is the hash of the original data; because it is effectively unique, it can serve as the data’s “ID card.”
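As a minimal sketch of this idea, the snippet below uses Python’s standard `hashlib` with MD5 (the algorithm the article mentions for video; the photo data here is a stand-in, not real image bytes): any amount of data is mapped to a short fixed-length fingerprint, which can then be matched against a stored blocklist without keeping the photo itself.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Map arbitrarily large data to a short, fixed-length hash (its "ID card")."""
    return hashlib.md5(data).hexdigest()

# A large blob of bytes maps to just 32 hex characters.
photo_bytes = b"\x89PNG-stand-in-pixel-data" * 100_000
h = fingerprint(photo_bytes)
print(h, len(h))

# The same data always yields the same hash, so a later re-upload can be
# matched against a stored set of hashes rather than the photos themselves.
blocklist = {h}
print(fingerprint(photo_bytes) in blocklist)  # True
```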
In fact, this technology is already widely used in the industry, mainly in child-protection applications: Microsoft, Google, Twitter, and others have deployed it, and Apple made the controversial move of “scanning users’ local photo libraries.”
Facebook emphasizes that the system deletes the nude photos themselves and keeps only their hashes.
This method also has a limitation: changing the original data changes the hash derived from it. Once a picture is even slightly edited, its hash value changes and the algorithm can no longer recognize it.
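This fragility is most extreme for cryptographic hashes like MD5, where flipping a single bit produces a completely different hash; perceptual hashes such as PDQ are designed to tolerate small edits better, though heavier editing still changes them. A quick sketch of the extreme case, using MD5 on stand-in data:

```python
import hashlib

original = b"stand-in pixel data for a photo" * 1000
edited = bytearray(original)
edited[0] ^= 1  # flip a single bit, simulating a tiny edit

h1 = hashlib.md5(original).hexdigest()
h2 = hashlib.md5(bytes(edited)).hexdigest()

print(h1)
print(h2)
print(h1 == h2)  # False: one flipped bit yields an entirely different hash
```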