If my sextape is uploaded, do they know I’m a dog?
Abstract
Face recognition is an emotional topic: from accuracy failures revealed in London to outright rejection in San Francisco; from the desire for privacy to the needs of policing; from the obligation to protect children and assist drivers to accidental or intentional exposure and disclosure. A seemingly never-ending array of systems and services is capturing and pushing images of our faces into the cloud. Got an outdoor, Internet-enabled camera? Then why not mount it on every lamppost for more comprehensive policing, or install it in schools to pick out rogue visitors? Couple the need many organizations have to keep facial images with continuous advances in ingesting data from disparate and seemingly incongruous sources, managing hundreds of terabytes of data, discovering non-obvious links and detecting hidden patterns, and we may find that the word “privacy” will soon be added to the list of obsolete words.

Recruiters have long understood the value of analyzing broad sets of data from multiple sources, ranging from open sources such as social networks to, as required, deep- and dark-web sources, both to screen and run background checks on candidates, uncover non-obvious connections and relationships, and flag issues. Add to the mix the ability to include data keyed to your face, and we may find that none of us are employable.

Security has always been, and remains, a ballet between requirements, threats, usability and budgets. And it is typically reactive, lagging innovation. Which brings us to porn. Imagine using face recognition to match social-media photos with sexted images or nonconsensual porn videos uploaded by ex-partners. With the names of the “stars” revealed, the consequences of such sexual privacy violations would be extreme.

The history of the Internet is replete with stories of anonymity and its positive value to society. In the early and uncertain days of the AIDS epidemic, the anonymity offered by the Cleveland Freenet provided a virtual safe place for sharing and support.
It was truly a time when, on the Internet, nobody knew you were a dog. Today, when every database seems destined to be breached, databases of images all but ensure that everyone will know you are a dog. While laws can make it illegal to scrape data to figure out whether someone once appeared in porn, technology is needed to make it hard. In this session we will step through the use cases and technology driving the erosion of anonymity, examine the relevant policy issues from multiple perspectives, and consider proposed ideas for a response.

Speaker Bio: Most recently, Ron was CEO of the Israeli start-up BioCatch, an innovator in cognitive biometrics and continuous authentication, and launched TrueBit Cyber Partners to provide truth, clarity and confidence in cybersecurity investments and acquisitions. He has been a cybersecurity venture partner with the Jerusalem-based equity crowdfunding firm OurCrowd since 2012 and is in Melbourne as entrepreneur-in-residence with Australia’s cybersecurity accelerator, CyRise.