Carnegie Mellon’s Data Privacy Head Urges Development of New Privacy Technologies

I enjoy reading Scientific American magazine, and I am especially interested in its articles that touch upon, or directly address, information security, privacy, or compliance. It is always nice to see the views of practitioners, educators, researchers, and others who are outside the typical circuit of information security publications.


So the article from the June 27, 2007 online issue, “Privacy Isn’t Dead, or At Least It Shouldn’t Be: A Q&A with Latanya Sweeney” caught my eye today.
Sweeney is a computer scientist and head of the Data Privacy Lab at Carnegie Mellon University, and it is refreshing to see a technology expert promote building privacy into computer systems while maintaining a positive outlook on the topic! Sweeney makes some great, enlightened statements; the many technologists who claim there is no privacy anymore would do well to change their attitude and thinking accordingly.
The article wasn’t long, but Sweeney made some comments that stood out to me.

“With today’s technology, though, you basically get a record from birth to grave and there’s no forgiveness. And so as a result we need technology that will preserve our privacy.”

Yes, new technologies to preserve privacy are needed, yet few organizations beyond educational institutions are researching truly new ones. Vendors are mostly repurposing existing technologies, and not very successfully.
When asked why she created the Data Privacy Lab at Carnegie Mellon University, Sweeney responded:

“One day I was in grad school and [in my research] I came across this letter that roughly said: at the age of two the patient was sexually molested, at the age of three she stabbed her sister with scissors, at the age of four, her parents got divorced, at the age of five she set fire to her home. And then I realized there was nothing in that description that [would be changed by] scrubbing out identifiable information. I’ll bet you there’s only one person with that experience. And that made me realize that identifiability is really fascinating, and it made me realize that I didn’t understand a thing about privacy. Removing the explicit identifiers wasn’t what it was about. I realized there’s a lot more to this than a notion of what makes me identifiable. And it was then that I started realizing that privacy in the data space is a little bit different. It requires tracking people where they go. And when all this technology began exploding, you begin to realize that it’s huge.”

This supports and expands upon the issues of trust and privacy I talked about yesterday: just because you don’t put sensitive pieces of information online doesn’t mean you aren’t giving away information that can be linked to a specific person and impact his or her privacy.
A little bit about Sweeney’s Identity Angel software:

“Your Identity Angel software is able to gather up disconnected pieces of information about people from data available all over the Internet. How does it work?
It is very easy to do scans for individuals from information that is publicly available or freely given away or sold for a cost. That means you don’t have to break into a system to get data you’re not supposed to have; it means you can gather the information from what is already out there. [Earlier in my career] I had learned that if I had the date of birth, gender and a five-digit zip code of a person, I could identify 87 percent of the people in the United States. So even if you don’t give me your social security number, I can find out who you are nearly nine out of 10 times.”

Yes, and it is alarming that so many people freely put so much of this information about themselves on the Internet, not realizing how it can be combined with other data to reveal details of their lives they never intended to share.
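Sweeney's point about quasi-identifiers can be sketched in a few lines. The records and field names below are made up for illustration, but the check itself, grouping records by the (date of birth, gender, ZIP code) combination and flagging combinations that occur only once, is the core idea behind k-anonymity analysis of a "de-identified" dataset:

```python
from collections import Counter

# Toy records with names already removed. All data here is fabricated
# for illustration only.
records = [
    {"dob": "1970-03-14", "gender": "F", "zip": "15213"},
    {"dob": "1970-03-14", "gender": "F", "zip": "15213"},
    {"dob": "1982-11-02", "gender": "M", "zip": "60614"},
    {"dob": "1990-07-21", "gender": "F", "zip": "94110"},
]

def k_anonymity_groups(rows, keys=("dob", "gender", "zip")):
    """Count how many records share each quasi-identifier combination."""
    return Counter(tuple(r[k] for k in keys) for r in rows)

groups = k_anonymity_groups(records)
# A combination appearing exactly once (k = 1) pinpoints a single person,
# even though no name or social security number appears anywhere.
unique = [combo for combo, count in groups.items() if count == 1]
print(f"{len(unique)} of {len(groups)} combinations identify one individual")
# → 2 of 3 combinations identify one individual
```

On real population data, Sweeney's 87 percent figure corresponds to the share of records landing in those k = 1 groups; the toy example above just shows the mechanics of the count.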
When asked, “How do we solve the privacy problem? What are the best and worst-case scenarios?” Sweeney responded:

“My answer is that the privacy problems that I’ve seen are probably best solved by the person who first created the technology. What we really have to do is train engineers and computer scientists to design and build technologies in the right kind of way from the beginning. Normally, engineers and computer scientists get ideas for technologies on their own and engage in a kind of circular thinking and develop a prototype of their solution and then do some kind of testing. But we are saying we will give them tools that help them see who are the stakeholders and do a risk assessment, and then see what barriers will come up and deal with the riskiest problems and work to solve them in the technology design. I think if we are successful in producing a new breed of engineers and computer scientists, society will really benefit. The whole technology-dialectics thing is really aiming at how you should go about teaching engineers and computer scientists to think about user acceptance and social adoption [and also that they] have to think about barriers to technology [from the beginning]. So the best scenario is that this kind of training takes hold and as new technologies emerge they are less likely to be constantly clashing with accept-or-reject options.”

Yes! It is so important to build security and privacy into systems, applications, and other technologies right from the very beginning. This should always be part of the systems development life cycle (SDLC). The mindset for creating technology must change: security and privacy must be treated as necessary, integral components, just as brakes are for cars. Security and privacy must stop being viewed as add-ons or options.
I really liked Sweeney’s thoughts about the famous, “Privacy is dead. Get over it.” quote that is so often used by technologists and software vendors who don’t want to think about, let alone address, privacy and security issues.

“Several years ago, Scott McNealy, the CEO of Sun Microsystems, famously quipped, “Privacy is dead. Get over it.”
Oh privacy is definitely not dead. When people say you have to choose, it means they haven’t actually thought the problem through or they aren’t willing to accept the answer. Remember, it’s in [McNealy’s] interest to say that, because he very much shares that attitude of the computer scientist who built the technology that’s invasive; who says, “Well, you want the benefits of my technology, you’ll get over privacy”. It’s exactly the kind of computer scientist we don’t want to be graduating in the future. We want the computer scientist who will resolve these kinds of barriers in conflict, identify them and resolve them in their technology design.”

This is great stuff.
We need to start teaching our future computer scientists at a very young age about the need to integrate security and privacy; then we can break the cycle perpetuated by technology leaders who insist privacy cannot be addressed. It would be nice if current technology leaders realized that you can create new privacy technologies and build in privacy and security. Some do, but more need to take action.
