Privacy Protection -- It’s all fun and games until someone loses their ID

If one is to believe Dr. Solove’s hypothesis, all is not lost; protecting privacy is still possible, even in today’s era of social media and overshare culture. As the Catsouras story illustrates, legal avenues are scarce and typically ineffective. That leaves it up to the individual to be proactive in safeguarding their private, personal information. A few guidelines to keep in mind to help fortify your personal information defense:

  • “Be careful what you share …” – even President Obama touched on the subject in his address to students in September 2009, warning that “whatever you do, it will be pulled up again later somewhere in your life” (Bradley, 2009, n.p.)
  • Be mindful of two key fundamentals – “Remember who your friends are, and know that a friend of a friend can be an enemy” (Bradley, 2009, n.p.)
  • Keep your audience in mind when writing posts, status updates and tweets – “more and more these days, we hear stories of people who have forgotten that their boss is part of their network and have said things online that have gotten them reprimanded, even fired” (leading to the creation of the term ‘Facebook fired’) (Bradley, 2009, n.p.)
  • Post with one rule in mind – “Don’t ever post anything online that you aren’t comfortable with everyone seeing, because eventually they probably will” (Bradley, 2009, n.p.) – keep your pictures private and watch what you say
  • Make use of the tools available to you, setting privacy controls where you can
  • Safeguard your social security number, as well as other information that can be used to identify you (actual address, school, birthday, etc.)
  • Know where your information is going – is it going to be shared with ‘trusted partners’ (third parties), and if so, how secure are they? (Alban, 2009, n.p.)

While there may not be many paths of legal recourse when it comes to protecting privacy, and the individual user can only do so much, there are technologies available now, and more in development, that could go a long way toward stronger privacy protection. One of those is the science of cryptography. The available cryptography protocols fall into five overarching categories:
1. Secure function evaluation (SFE)
2. Encryption
3. Authentication
4. Anonymous channels
5. Anonymous authorization (a special case of zero-knowledge proof, where a user can prove to another that something is true without revealing what the proof is) (Lysyanskaya, 2008, p. 90-91)
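To make the zero-knowledge idea concrete, consider a Schnorr-style identification protocol, a classic example of proving knowledge of a secret without revealing it. The sketch below is not drawn from the sources cited here; the prime, generator and secret are miniature values invented purely for illustration and far too small for real security:

```python
import random

# Toy Schnorr identification protocol: the prover convinces the verifier
# that she knows a secret x with y = g^x (mod p), without revealing x.
# All parameters are illustrative -- real deployments use huge primes.

p = 2267                      # a prime; p - 1 = 2 * 11 * 103
q = 103                       # prime order of the subgroup we work in
g = pow(2, (p - 1) // q, p)   # generator of the order-q subgroup

x = 47                        # the prover's secret
y = pow(g, x, p)              # the prover's public key

# Commitment: prover picks a random r and sends t = g^r (mod p)
r = random.randrange(q)
t = pow(g, r, p)

# Challenge: verifier sends a random c
c = random.randrange(q)

# Response: prover sends s = r + c*x (mod q); s leaks nothing about x
# because r is random
s = (r + c * x) % q

# Verification: g^s must equal t * y^c (mod p)
verified = pow(g, s, p) == (t * pow(y, c, p)) % p
print(verified)  # → True
```

The verifier learns only that the equation balances, which can happen (except with negligible probability) only if the prover really knows x; the transcript itself reveals nothing about x.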

In addition to the technology of cryptography, developments in the arena of biometrics -- the automated recognition of people via distinctive anatomical and behavioral traits -- are working toward making the future safe for privacy.
“[B]iometric traits are profoundly more difficult to forge, copy, share, misplace or guess. … Biometric systems require traits with two basic features: they must be unique for each person, and they must not change significantly with time” (Jain & Pankanti, 2008, p. 78). Three of the most popular traits used in biometric systems are fingerprints, the face and the iris.
  • Fingerprints have been used for over 100 years in law enforcement as a way to identify individuals. Sensors and readers are now cheap and compact, making it easier to take prints as a form of identification. The drawback to the smaller sensors is higher error rates, since only a portion of the print is actually read (Jain & Pankanti, 2008, p. 79-80).
  • Using the face as the key for a biometric system makes use of the ubiquitous cameras built into our computers and cell phones, enabling more areas to be protected. The drawback is that accuracy is high only when the image is taken in a controlled environment – same lighting, same angle toward the camera, no facial expression (Jain & Pankanti, 2008, p. 80).
  • Use of the iris is often favored given its high accuracy and speed. The iris is scanned and identification is “done by comparing a person’s bit sequence to the sequences in a database” (Jain & Pankanti, 2008, p. 80-81). The drawback is that the system depends entirely on “algorithms that represent the random patterns in the iris as a sequence of bits – no known human experts can determine whether or not two iris images match” (Jain & Pankanti, 2008, p. 81).
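The bit-sequence comparison described for the iris is commonly done by measuring the fraction of differing bits (the Hamming distance) between a probe and each enrolled code. Here is a minimal sketch of that idea; the 16-bit codes, the names and the 0.25 threshold are all invented for illustration (real iris codes run to thousands of bits):

```python
# Toy sketch of iris-code matching: each enrolled person is stored as a
# bit string, and a probe matches whoever differs in the smallest fraction
# of bits, provided that fraction is under a decision threshold.
# Codes and threshold are illustrative, not real system values.

def hamming_fraction(a: str, b: str) -> float:
    """Fraction of positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

database = {
    "alice": "1011001110001101",
    "bob":   "0100110001110010",
}

def identify(probe: str, threshold: float = 0.25):
    """Return the closest enrolled identity, or None if nothing is near enough."""
    best = min(database, key=lambda name: hamming_fraction(probe, database[name]))
    return best if hamming_fraction(probe, database[best]) <= threshold else None

print(identify("1011001110001001"))  # one bit off alice's code → alice
print(identify("1111111111111111"))  # far from every enrolled code → None
```

The threshold is the crucial design choice: it encodes how much natural variation between two scans of the same iris the system is willing to tolerate.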

On the surface, biometric security seems like the ideal solution – no passwords to forget, nothing to hack, no PIN to memorize, and the key that unlocks the security is something unique to each person and always with them. Yet the dirty little secret of biometric systems is that the entire process is based on the concept of the “imperfect match” (Jain & Pankanti, 2008, p. 81). Systems have to make the call to accept or reject authorization based on how closely what is presented matches what is on file. Depending on the threshold of the system, the errors of “false accept” and “false reject” (Jain & Pankanti, 2008, p. 81) undermine the effectiveness of the protection. One solution being employed is to read multiple biometric traits – scanning all ten prints instead of just one, or using the prints, the face and the iris in tandem to make the identification.
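The accept-or-reject call comes down to comparing a similarity score against a threshold, and the two error types trade off against each other. The sketch below illustrates that trade-off; the scores and thresholds are invented for illustration only:

```python
# Sketch of the "imperfect match" decision: a similarity score is compared
# to a threshold, and errors fall out on both sides of the cut.
# All scores and thresholds here are made up for illustration.

genuine_scores  = [0.91, 0.85, 0.78, 0.66, 0.95]  # same person vs. enrollment
impostor_scores = [0.30, 0.55, 0.71, 0.42, 0.12]  # different person

def error_rates(threshold: float):
    """Return (false accept rate, false reject rate) at a given threshold."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# A strict threshold keeps impostors out but turns away genuine users;
# a lenient one does the opposite -- the trade-off described above.
print(error_rates(0.90))  # strict:  → (0.0, 0.6)
print(error_rates(0.50))  # lenient: → (0.4, 0.0)
```

No single threshold drives both error rates to zero, which is exactly why multi-trait systems (all ten prints, or prints plus face plus iris) are attractive: combining independent scores shrinks both errors at once.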
The use of such personal and unique traits as identifiers raises privacy red flags. With such a large collection of data, the concern over “who owns the data -- the individual or the service providers” (Jain & Pankanti, 2008, p. 81) is front and center in people’s minds. In addition to ownership, the ever-present possibility of the data being misused for a purpose beyond its original intent raises eyebrows when it comes to biometric systems.
“Biometric systems of the future will probably operate unobtrusively, capturing biometric traits without the active involvement of the user. Such stealth further confounds the privacy issue” (Jain & Pankanti, 2008, p. 81). The privacy issue is so nuanced, so chock-full of shades of grey, that even the solutions have problems.