Klaus

Fawkes

 Wed, 26 Aug 2020 21:27:30 +0200 
So how do we protect ourselves against unauthorized third parties building facial recognition models that recognize us wherever we go? Regulations can and will help restrict the use of machine learning by public companies, but they will have negligible impact on private organizations, individuals, or even other nation states with similar goals.

The SAND Lab at the University of Chicago has developed Fawkes, an algorithm and software tool (running locally on your computer) that gives individuals the ability to limit how unknown third parties can track them by building facial recognition models out of their publicly available photos.

At a high level, Fawkes "poisons" models that try to learn what you look like by putting hidden changes into your photos and using them as Trojan horses to deliver that poison to any facial recognition model of you. Fawkes takes your personal images and makes tiny, pixel-level changes that are invisible to the human eye, in a process we call image cloaking. You can then use these "cloaked" photos as you normally would: sharing them on social media, sending them to friends, printing them, or displaying them on digital devices, the same way you would any other photo.

The difference, however, is that if and when someone tries to use these photos to build a facial recognition model, the "cloaked" images will teach the model a highly distorted version of what makes you look like you. The cloak effect is not easily detectable by humans or machines and will not cause errors in model training. However, when someone tries to identify you by presenting an unaltered, "uncloaked" image of you (e.g. a photo taken in public) to the model, the model will fail to recognize you.
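To make the cloaking idea concrete, here is a minimal conceptual sketch, not the actual Fawkes code: it treats a pretrained ResNet as a stand-in for an attacker's feature extractor and nudges a photo's embedding toward a dissimilar "target" identity while keeping every pixel change inside a small budget. The surrogate model, the cloak() helper, and the eps/steps/lr values are all illustrative assumptions, and real face-recognition preprocessing is omitted for brevity.

# Conceptual sketch of image cloaking (illustrative; not the Fawkes implementation).
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Stand-in for the attacker's facial recognition feature extractor.
embed = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
embed.fc = torch.nn.Identity()  # use penultimate-layer features as the embedding
embed.eval()

to_tensor = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def cloak(own_photo, target_photo, eps=0.03, steps=100, lr=0.005):
    """Return a cloaked copy of own_photo: per-pixel changes bounded by eps
    that pull its embedding toward a dissimilar target identity."""
    x = to_tensor(Image.open(own_photo).convert("RGB")).unsqueeze(0)
    t = to_tensor(Image.open(target_photo).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        target_feat = embed(t)
    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feat = embed((x + delta).clamp(0, 1))
        # Move the cloaked image's features toward the target identity...
        torch.nn.functional.mse_loss(feat, target_feat).backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # ...while keeping the change imperceptible
    return (x + delta).clamp(0, 1).detach()

A model trained on such cloaked photos learns features near the target identity rather than yours, which is why an unaltered photo of you later fails to match.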
Haakon Meland Eriksen (Parlementum)
 Wed, 26 Aug 2020 22:23:08 +0200 
I feel unjustly responsible, having posted a similar idea about hiding in plain sight here perhaps six to eight years ago, but my idea was to embed an encrypted message in slightly altered pixel values. :-D
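What the comment describes is essentially classic least-significant-bit (LSB) steganography. A minimal sketch, assuming the message is already encrypted and the output is saved losslessly as PNG; the function names are illustrative:

# Hide an (already encrypted) message in the least significant bits of an
# image's pixels. Each pixel value changes by at most 1, invisible to the eye.
from PIL import Image
import numpy as np

def embed_bits(image_path, payload: bytes, out_path):
    px = np.array(Image.open(image_path).convert("RGB"))
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = px.reshape(-1)  # view: writing here modifies px in place
    assert bits.size <= flat.size, "image too small for payload"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite LSBs
    Image.fromarray(px).save(out_path, "PNG")  # PNG: lossless, LSBs survive

def extract_bits(image_path, n_bytes):
    flat = np.array(Image.open(image_path).convert("RGB")).reshape(-1)
    return np.packbits(flat[:n_bytes * 8] & 1).tobytes()

Unlike cloaking, this hides data in the image rather than distorting what a model learns from it, but both rely on pixel changes too small for a human to notice.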