Organizations Must Consider Privacy Harms

The expanding use of smart gadgets in the Internet of Things (IoT) is creating more privacy risks than ever before. Many businesses are also (finally!) starting to address privacy, and interest in how to establish privacy programs and how to perform privacy impact assessments (PIAs) to identify privacy risks is increasing. The privacy risks to the business include such things as:

  • Breaches resulting in civil suits with costly final judgments
  • Non-compliance penalties for not appropriately addressing privacy, which could bring large fines and decades of regulatory scrutiny under final consent decrees
  • Loss of customer trust, resulting in lost business
  • Devalued brand

However, organizations must now also move into new territory and consider privacy harms. Privacy harms are the bad things that could happen to the individuals associated with personal information; they occur outside the corporate border and typically have no direct negative impact on the organization itself. Here are just a few examples of privacy harms, as identified by the current National Institute of Standards and Technology (NIST) Privacy Engineering initiative:

  • Loss of Trust
  • Loss of Self-Determination
    • Loss of Autonomy
    • Exclusion
    • Loss of Liberty
    • Physical Harm
  • Discrimination
    • Stigmatization
    • Power Imbalance
  • Economic Loss

Addressing privacy requires considering privacy harms in addition to identifying security and privacy risks. This is a huge change for organizations; most of those doing information security risk assessments aren’t used to looking outward from the business to consider privacy harms to the associated individuals (customers, patients, etc.). When they do not do this, privacy cannot be effectively considered and mitigated. It is important for organizations to incorporate consideration of privacy harms into their privacy impact assessments (PIAs) and other risk management activities.
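
To make that a bit more concrete, here is a minimal, hypothetical sketch (in Python) of what a single PIA or risk-register entry might look like once harms to individuals are recorded alongside risks to the business. The field names and structure are my own illustrative assumptions, not any official PIA template; the harm categories come from the NIST list above.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative NIST-style harm categories from the list above.
HARM_CATEGORIES = [
    "Loss of Trust",
    "Loss of Self-Determination",
    "Discrimination",
    "Economic Loss",
]

@dataclass
class PrivacyRiskEntry:
    """Hypothetical PIA / risk-register entry; field names are assumptions."""
    system: str                          # product or process being assessed
    business_risks: List[str]            # inward-looking: fines, lawsuits, devalued brand
    harms_to_individuals: List[str]      # outward-looking: harms to customers, patients, etc.
    mitigations: List[str] = field(default_factory=list)

entry = PrivacyRiskEntry(
    system="Connected smart doll",
    business_risks=["Non-compliance fines", "Civil suits after a breach"],
    harms_to_individuals=["Loss of Trust", "Loss of Self-Determination"],
    mitigations=["To be determined during the PIA"],
)

# A simple check that every listed harm maps to a recognized category.
assert all(h in HARM_CATEGORIES for h in entry.harms_to_individuals)
print(entry)
```

The only point of the sketch is that the outward-looking column exists at all; most risk registers today capture only the inward-looking one.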

Case Study

Let’s take the following scenario and run through some of the types of privacy harms that could occur to the individuals involved.

Your organization is a toy manufacturer. You are creating a new type of doll that will be leaps and bounds more evolved than the old dolls that talk when a string is pulled. This new doll will be smart: built to interact with the child playing with it, and connected to the Internet of Things. A smart toy that learns from the child playing with it and responds in ways never before possible…making the child consider the doll to be his or her best friend! These toys are going to fly off the shelves, and will revolutionize the doll industry!!! And not only that, your organization will collect a lot of great data about children’s likes, dislikes, and trends, and gain another way to market your other toys through the suggestions of each child’s new trusted friend. Children will be delighted, sales will increase; a win-win for everyone!

Here are the high-level specs for the doll (a rough sketch of what one recording record might look like follows the list):

  • The doll will record everything seen and heard around it.
  • The recording will be transmitted over a Wi-Fi connection and stored in the organization’s cloud along with the recordings from all the other dolls that have been sold.
  • Big data analysis will help each doll learn about the child that plays with it.
  • The doll will be able to talk with the child using phrases and words customized to how the child talks.
  • The doll will be able to ask the child questions to learn the child’s likes and dislikes, where the child lives, the child’s activities, and other types of information.
  • The longer the child has the doll, the more customized the conversations become, drawing on more and more of the collected information.
  • The doll will be able to suggest activities, food, movies, toys, and an unlimited number of other things to the child.
  • The child’s legal guardians can be given access to their own child’s recordings if they want to know what the child is talking to the doll about.
  • The doll can recognize the child’s voice, so it will only talk with that child, and will not divulge any secrets, or skew its learning about the child, if a different child talks to it.
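
As flagged above, here is a rough, hypothetical sketch of what a single recording record might look like under these specs. Every field name is an assumption made for illustration, not the manufacturer’s actual schema; the point is how much sensitive detail accumulates per child, per recording.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class DollRecording:
    """Hypothetical single recording record; every field name is an assumption."""
    doll_id: str                       # identifies the doll (and, indirectly, the household)
    child_voiceprint_id: str           # voice-recognition profile used to gate conversations
    captured_at: datetime              # when the audio was captured
    transcript: str                    # everything seen and heard around the doll
    inferred_likes: List[str] = field(default_factory=list)           # built up by big data analysis
    inferred_location_hints: List[str] = field(default_factory=list)  # e.g., answers to "Where do you live?"
    guardian_access_granted: bool = False                             # may legal guardians review this recording?

example = DollRecording(
    doll_id="doll-0001",
    child_voiceprint_id="vp-42",
    captured_at=datetime(2016, 3, 1, 16, 30),
    transcript="I live on Maple Street and I don't like broccoli.",
    inferred_likes=["dinosaurs"],
    inferred_location_hints=["Maple Street"],
)
print(example)
```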

Brilliant! So, what could go wrong? Well, many things. For now, let’s consider just a few of the potential privacy harms.

  1. Loss of Trust: This could occur if the information about each child is given or sold to others (e.g., marketing companies, schools, insurance companies, mental health organizations, law enforcement).
  2. Loss of Self-Determination
    1. Loss of Autonomy: The algorithm could be created in such a way that the doll suggests activities that the child would not otherwise have done, or that could put the child in harm’s way.
    2. Physical Harm: If someone hacked into the system controlling what the doll says, the hacker could make the doll tell the child to do something harmful. Can you imagine if a hacker made the doll say something like, “Suzy, go find your parents’ gun!”

It could be the stuff of nightmares. The organization needs to establish controls to mitigate the possibility of these privacy harms. And I didn’t even touch upon privacy risks and information security risks.
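
To show that such controls are practical, here is a minimal sketch of one possible control for the hacked-doll scenario: the cloud service signs every approved phrase, and the doll refuses to speak anything whose signature does not verify. This is only an illustration using a shared secret, not a description of any actual toy’s design; a real deployment would need per-device key provisioning, secure key storage, and replay protection.

```python
import hmac
import hashlib

# Deliberately simplified: the key below is illustrative only.
SHARED_KEY = b"per-device-secret-provisioned-at-manufacture"

def sign_phrase(phrase: str) -> str:
    """Cloud side: attach an HMAC-SHA256 tag to an approved phrase."""
    return hmac.new(SHARED_KEY, phrase.encode("utf-8"), hashlib.sha256).hexdigest()

def doll_should_speak(phrase: str, tag: str) -> bool:
    """Doll side: speak only phrases whose tag verifies against the shared key."""
    expected = hmac.new(SHARED_KEY, phrase.encode("utf-8"), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

# A phrase signed by the legitimate service is accepted...
phrase = "Would you like to hear a story?"
assert doll_should_speak(phrase, sign_phrase(phrase))

# ...while an injected phrase with a forged or missing tag is rejected.
assert not doll_should_speak("Go find your parents' gun!", "forged-tag")
```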

Whenever considering any new product that interacts with users and collects information from them, organizations must identify the potential privacy harms and then implement controls to mitigate them.


This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. For more on these topics, visit Dell’s thought leadership site PowerMore. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies.


