Today, with the rapid development of science and technology, face recognition is being applied in more and more scenarios, bringing great convenience to people's lives. However, the uneven quality of recognition technologies also creates a risk of information leakage. Regulatory authorities and industry organizations should therefore step in and formulate technical standards for face recognition. For example, a recognition program could be required to match a minimum number of facial feature points before it may be brought to market, so that crude recognition software cannot pass itself off as reliable.
Beyond strengthening face recognition technology itself, the storage and management of facial information must also be improved; preventing facial data from being stolen is a crucial step. If facial photos once mainly concerned the subject's portrait rights and privacy, today they function almost as a master key: with one, a thief can unlock doors and open bank accounts. Leaking a facial photo is as dangerous as leaking a password.
In the era of social networks, people should also be wary of the abuse of their own facial information. As the face-photo trade uncovered by media investigations shows, criminals need no sophisticated means to obtain face photos: with simple web-crawling techniques, they can harvest large numbers of photos from social networking sites, and almost all of these photos were voluntarily published by users themselves on open platforms. With only a little data analysis, those photos can be matched to a person's identity and all kinds of other information, compounding the hidden danger of information leakage.
All-round supervision of face recognition technology and of the authorized use of face data should be established as soon as possible. This includes prohibiting the take-it-or-leave-it clauses by which enterprises force users to hand over facial data when downloading apps, rectifying and cleaning up "face-swap" software, and regulating the contracts through which enterprises obtain authorization to use users' facial data. Enterprises must not be allowed to exploit information asymmetry or contractual traps to make users sign unequal terms that let them use face data at will. At the same time, the cost of illegally using face data without authorization and infringing on citizens' privacy and other rights and interests should be further raised.