October 2009
Greetings!


Last month we explored the privacy and trust considerations surrounding virtual health communities, online portals where people post their private medical information in hopes of gaining knowledge and helping others.
 
This month we turn to another online health paradigm that lives or dies by patient trust: electronic medical records (EMRs). President Obama asserts that, by 2014, the health records of all Americans will be stored electronically. That's easier said than done, writer Larry Dobrow found when he talked to those entrenched in the day-to-day realities of the U.S. healthcare industry, who cite patchwork state laws, interoperability issues, and politics as usual as monstrous impediments to bringing records online.
 
Protecting those records once they are online is another can of worms, yet one that could make or break the success of EMRs. With this in mind, writes Dobrow, the government would do well to heed the approach of those in the private sector whose bottom lines depend upon the trust of the people who entrust them with their private health data. "Don't understate the role of public trust," says one.
 
Trust-building is the theme of our second article this month. In recent months we've explored different approaches to transparency and disclosure that can help or hinder trust-building efforts. The privacy policy is a place where that building (or breaking down) can begin, even though most of us--practitioners and consumers alike--probably don't think of it in those terms.
 
In "How to Read a Privacy Policy," the folks at the not-for-profit Common Data Project share the results of their intensive examination of 15 online privacy policies. Their insight sheds light on how such policies could be used to build trust and how, in an environment of trust, consumers might be more likely to loosen the leash on their valuable data.


J. Trevor Hughes, CIPP
Executive Director, IAPP



Drive to digital records begs due diligence

By Larry Dobrow

While making the case for his overhaul of the public healthcare system, President Obama has touted one of its provisions: that all personal health records should be digitized and melded into a single standard format within five years. Doing so, he believes, will reduce administrative headaches--one provider won't have to get on the phone with another to access a patient's hard-copy record--and thus bolster the overall quality of care. The medical community largely supports the digitization push.

In theory, then, computerizing health records makes all the sense in the world. In practice, however, the project will prove considerably more challenging.

Individuals who have participated in such projects caution that the digitization of any large mass of sensitive data is an endeavor fraught with risk, especially from a privacy and security standpoint. They warn that creating a national privacy framework from the patchwork of laws and regulations currently governing the handling of this information could prove a political and practical migraine. And they say that the healthcare industry's reputation for being slow to embrace new information technology isn't going to help matters much.

"Healthcare has traditionally been a laggard in this space," notes Steve Smerz, chief information officer and chief security officer of VisionShare, a firm which offers connectivity for healthcare entities. "In finance or even manufacturing, they've applied the technology and automated their businesses much more substantially than healthcare companies have. It's going to be hard to push things through."

Adds Lisa Gallagher, senior director of privacy and security for the Healthcare Information and Management Systems Society: "There's a tremendous amount of work that still needs to be done concerning the records. We're operating in an environment where there are 50 different [state] privacy policies. Electronic health records pose many difficult issues on the policy and implementation levels."
 
It makes sense, then, that government officials entrusted with this project would look towards private entities, whether healthcare-focused or not, whose bottom lines depend upon the vigorous protection of sensitive personal data. Before they do, however, they'd be wise to heed the lessons still being learned by their counterparts in other nations.
 
Skepticism about Canada's transition to computerized health records has perhaps been exacerbated by reports of health privacy breaches. Headlines such as "300K patient files on stolen laptops" and "Albertans' health records exposed to computer hacker" don't exactly inspire confidence. Australian authorities entrusted with the online records push have been slowed by many of the same issues, with the country's privacy commissioner scolding her peers for pursuing privacy quick-fixes. And the seemingly regular headlines about breaches of electronic records within Britain's National Health Service should give U.S. authorities pause.
 
The problem, say some, is one of both lax supervision--many of the companies and institutions handling the digital health records lacked a chief privacy officer or comparable individual responsible for overseeing the process--and poor coordination among the myriad players in the healthcare system.

"You can't just say 'we're going to do this.' Don't understate the role of public trust. There has to be plenty of explanation," notes Andro Hsu, science and policy liaison for 23andMe, which seeks to help individuals better understand their genetic information. 23andMe has proposed legislation in California to clarify existing privacy regulations and upgrade privacy protections.

Smerz puts it more succinctly: "Transparency is the key. If people feel that they know what the government is doing and not doing with their data, they're more comfortable with it."

The government must also heed the approach of private firms when it comes to data minimization (malfeasants can't steal information that isn't collected) and a major-league education process for any and all government employees who will handle the information on even an occasional basis (lesson one: storing data on laptops is bad). It wouldn't hurt to do more than merely encrypt the data; extensive data masking, for instance, would render decoding the information enormously challenging.
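To make the distinction concrete, here is a minimal sketch of what field-level data masking might look like; the record fields and masking rules below are our own invention, not drawn from any government or vendor specification. Unlike encryption, which can be reversed with the right key, masking discards the sensitive characters outright.

```typescript
// A minimal, hypothetical sketch of field-level data masking.
// Field names and masking rules are invented for illustration only.
interface PatientRecord {
  name: string;
  ssn: string;       // e.g. "123-45-6789"
  diagnosis: string;
}

function maskRecord(record: PatientRecord): PatientRecord {
  return {
    // Keep only the first initial of the name.
    name: record.name.charAt(0) + "*".repeat(Math.max(record.name.length - 1, 0)),
    // Mask every digit of the SSN except the last four.
    ssn: record.ssn.replace(/\d(?=(?:\D*\d){4})/g, "*"),
    // The diagnosis stays in the clear for care purposes.
    diagnosis: record.diagnosis,
  };
}

console.log(maskRecord({ name: "Jane Doe", ssn: "123-45-6789", diagnosis: "asthma" }));
// -> { name: "J*******", ssn: "***-**-6789", diagnosis: "asthma" }
```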

Possible roadblocks include the readiness of healthcare system players. "With some of my customers, I find that they don't even have a firewall up in their data environment," says Smerz. And, of course, there remains a vocal minority who believe that no matter what the government does, it will never truly be able to guarantee the sanctity of the information. "You hear more about the failures than you do the successes," Smerz adds.

As for when the digitization effort will proceed, Smerz and Gallagher believe the timetable offered up by the Obama administration might be a tad optimistic. Gallagher notes that there are "so, so many players" involved, while Smerz anticipates a series of "small wins" before the big ones occur. "Four years is not obtainable," Smerz adds flatly.

Both, however, cite momentum towards a solution and a willingness from most of the players involved to hammer out a privacy-sensitive solution. "We're in a care-giving environment," Gallagher says. "It's a time and place where people's intentions are good. If that's the general culture, this whole process can be a really positive thing."
PRACTICAL WORKSHOP ADDRESSES GLOBAL PRIVACY CHALLENGES
Don't miss this chance to connect with our global privacy community and stay abreast of the latest solutions to today's most challenging data protection problems. The IAPP will bring together leading experts from around the world to share their insights and approaches to some of the most demanding operational issues and risks at the IAPP Data Protection and Privacy Workshop, November 3 in Madrid. Topics on the agenda include notice of security breach, global transfer solutions, cloud computing, online privacy and the future of the profession. See the complete program now.
Register Now

How to read a privacy policy

By The Common Data Project

As part of its mission to educate the public about online privacy issues, the Common Data Project embarked on a project to discover how companies compete on the basis of privacy. CDP surveyed the online privacy policies of large Internet companies, social networks, and start-ups to develop a "how-to-read" guide for those unable to understand, or unwilling to wade through, the lengthy and complex privacy policies that are so common in today's marketplace. Using seven questions, the CDP set out to extract the information users need in order to understand the practices behind the policies and to participate in the public debate about online data collection.

The Common Data Project was created to encourage and enable the disclosure of personal data for public re-use through the creation of a technology and legal framework for anonymized data-sharing. Specifically, we think that means creating a new kind of institution called a datatrust, which is exactly what it sounds like: a trusted place to store and share sensitive, personal data.

So why did we spend a lot of time parsing the legalese of some excruciatingly long privacy statements?

We know that an easy-to-understand, clear-cut privacy policy is critical to the viability of a datatrust. And we felt the first step in figuring out what constitutes an easy-to-understand, clear-cut privacy policy would be to look at what privacy policies are promising today.

We realize that most users of online services have not read--and never will read--the privacy policies so carefully crafted by teams of lawyers at large companies. And having read all of these documents (many times over), we're not convinced that anyone should read them, other than to confirm what they probably already know: A lot of data is being collected about them, and it's not really clear who gets to use that data, for what purpose, for how long, or whether any or all of it can eventually be connected back to them.

Yet people continue to use Google, Microsoft, Yahoo and other sites without giving much thought to the privacy implications of giving up their data to companies.

We at the Common Data Project know that for a datatrust to function properly, we can't rely on people to simply look the other way, nor do we want them to.

Data collection for Google and Microsoft users is incidental. People go to Google.com to search, not to give data. As long as they have a good search experience, the data collection is largely out of sight, out of mind.

A datatrust, on the other hand, will be a service explicitly designed around giving and sharing data. We know that to convince the public that the datatrust can indeed be trusted, a clear privacy story is absolutely necessary.

Below we offer a guided tour through the privacy policies of 15 online services--from established players to major retailers to Web 2.0 starlets and aspiring start-ups that hope to compete on superior privacy guarantees. Our goal was to identify where their policies were ambiguous or simply confusing.

Companies surveyed
The companies and organizations whose policies the CDP analyzed were chosen for being among the most trafficked sites, as well as for providing a range of services online.
Privacy is not exclusively an online issue, even though the companies surveyed here all operate online. Many of the largest data breaches over the last 10 years have involved companies and agencies that operate exclusively offline, and how to manage, store, and share large amounts of information is an important question for almost every business today. However, we chose to focus on online businesses and organizations because they have been among the most visible in illustrating the dangers and advantages of amassing great quantities of data.

Here is a quick visual of how their respective privacy policies stack up next to each other, literally.

[Graph: the 15 surveyed privacy policies stacked side by side, compared by length]

For a larger version of the graph, visit www.commondataproject.org.

The analysis
To guide the analysis, we used seven questions to help pinpoint the issues deemed most crucial for users' privacy.
  • What data collection is happening that is not covered by the privacy policy?
  • How do they define "personal information"?
  • What promises are being made about sharing information with third parties?
  • What is their data retention policy and what does it say about their commitment to privacy?
  • What privacy choices do they offer to the user?
  • What input do users have into changes to the policy's terms?
  • To what extent do they share the data they collect with users and the public?
1. What data collection is happening that is not covered by the privacy policy?
This first question might seem like an odd one. But the fact that there is data collection going on that's not covered by the "privacy policy" captures so much of what is confusing for users who are accustomed to the bricks-and-mortar world.

When you walk into your neighborhood grocery store, you might not be surprised that the owner is keeping track of what is popular, what is not, and what items people in the neighborhood seem to want. You would be surprised, though, if you found out that some of the people in the store who were asking questions of the customers didn't work for the grocery store. You would be especially surprised if you asked the grocery store owner about it, and he said, "Oh those people? I take no responsibility for what they do." But in the online world, that happens all the time. Obviously, when a user clicks on a link and leaves a site, he or she ends up subject to new rules. But even when a user doesn't leave a site, third-party advertisers are collecting data while he or she sits there.
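For readers curious what that in-page collection can actually look like, here is a hypothetical browser-side sketch; the ad network, endpoint, and parameter names are invented for illustration.

```typescript
// Hypothetical sketch: a third-party ad script embedded in a page,
// reporting the visit back to the ad network's server rather than the
// site's own. The endpoint and parameters are invented.
const pixel = new Image();
pixel.src =
  "https://ads.example-network.com/collect" +
  "?page=" + encodeURIComponent(window.location.href) +
  "&ref=" + encodeURIComponent(document.referrer) +
  "&t=" + Date.now();
// Assigning src is enough to fire the request; nothing visible changes on the page.
```

The user never leaves the site, yet a company with no visible presence on the page has just logged the visit.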

In this section, we review how companies handle this in their privacy policies. 

2. How do they define "personal information"?
Most privacy certification programs, such as TRUSTe, require that the privacy policy identify what kinds of personally identifiable information (PII) are being collected. As a result, nearly every privacy policy we looked at included a long list of the types of information being collected.

In this section, we examine how companies approach this disclosure. Some companies categorize, others label, and others use the disclosure to tout the fact that they collect no information whatsoever. This section also explores how companies approach the topic of re-identification.
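As a concrete (and entirely invented) illustration of the re-identification risk, consider how an "anonymized" record can be linked back to a name by joining it with a public roster on a few quasi-identifiers:

```typescript
// Hypothetical sketch of re-identification by linking. All records,
// field names, and values below are invented for illustration.
interface AnonymizedRow { zip: string; birthDate: string; sex: string; diagnosis: string }
interface PublicRow     { zip: string; birthDate: string; sex: string; name: string }

function reidentify(anon: AnonymizedRow[], roster: PublicRow[]) {
  // Join the two tables on ZIP code, birth date, and sex.
  return anon.flatMap(a =>
    roster
      .filter(p => p.zip === a.zip && p.birthDate === a.birthDate && p.sex === a.sex)
      .map(p => ({ name: p.name, diagnosis: a.diagnosis }))
  );
}

const anon   = [{ zip: "03909", birthDate: "1970-01-02", sex: "F", diagnosis: "asthma" }];
const roster = [{ zip: "03909", birthDate: "1970-01-02", sex: "F", name: "Jane Doe" }];
console.log(reidentify(anon, roster)); // -> [{ name: "Jane Doe", diagnosis: "asthma" }]
```

The point is that stripping the name alone is not "anonymization": a handful of ordinary attributes can be enough to single a person out.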

3. What promises are being made about sharing information with third parties?
In addition to listing the types of data collected from you, most privacy policies will also list the reasons for doing so. The most common are:
  • To provide services, including customer service
  • To operate the site/ensure technical functioning of the site
  • To customize content and advertising
  • To conduct research to improve services and develop new services.
They also list the circumstances in which data is shared with third parties, the most common being:
  • To provide information to subsidiaries or partners that perform services for the company
  • To respond to subpoenas, court orders, or legal process, or otherwise comply with law
  • To enforce terms of service
  • To detect or prevent fraud
  • To protect the rights, property, or safety of the company, its users, or the public
  • Upon merger or acquisition
This section examines various companies' approaches to communicating this message. Language aimed at normalizing and euphemizing the information being relayed was common.

For us at CDP, the issue isn't whether IP addresses are included in the "personal information" category or not. What we really want to see are honest, meaningful promises about user privacy. We would like to see organizations offer choices to users about how specific pieces of data about them are stored and shared, rather than simply make broad promises about "personal information," as defined by that company.

It may turn out that "personal" and "anonymous" are categories that are so difficult to define, we'll have to come up with new terminology that is more descriptive and informative. Or companies will end up having to do what Wikipedia does: simply state that it "cannot guarantee that user information will remain private."

4. What is their data-retention policy and what does it say about their commitment to privacy?
Data retention has been a controversial issue for many years, with American companies not measuring up to the European Union's more stringent requirements. But for us, the retention debate obscures what's really at stake and often confuses consumers.


For many privacy advocates, limiting the amount of time data is stored reduces the risk of exposure. The theory, presumably, is that sensitive data is like toxic waste, and the less of it lying around, the better off we are. But that theory, as appealing as it is, doesn't address the fact that our new abilities to collect and store data are incredibly valuable, not just to major corporations, but to policymakers, researchers, and even the average citizen. Focusing on this issue of data retention hasn't necessarily led to better privacy protections. In fact, it may be distracting us from developing better solutions.

This section explores how some companies handle (or don't) the topic of data retention in their privacy policies.

5. What privacy choices do they offer to the user?
Over the last year or two, there have been some interesting changes in the way some companies view privacy. They're starting to understand that people not only care about whether the telemarketer calls them during dinner, but also whether that telemarketer already knows what they're eating for dinner.

This section explores that evolution and offers one option for taking it a significant step forward.

6. What input do users have into changes to the policy's terms?
Not surprisingly, none of the policies we looked at stated that users would have a say in changes to the privacy terms. In this section, we review what companies do say in this regard, and also shed light on how Facebook's recent handling of changes to its policy terms did open the door for users to have a say, why this is good, and why the company had to do it.

7. To what extent do they share the data they collect with users and the public?
When we started this privacy policy survey, our goal was simple: find out what these policies actually say. But our larger goal was to place the promises companies made about users' privacy in a larger context--how do these companies view data? Do they see it as something that wholly belongs to them? Ultimately, their attitude towards this data very much shapes their attitude towards user privacy.

This section discusses companies' data-sharing activities and what it means, or could mean, for consumers.

Conclusion
By our standards, none of the privacy policies we surveyed quite measures up. Most of them provide incomplete information on what "personal information" means. Many of them fail to make clear that they are actively sharing information with third parties. Even when they change their policies on something like data retention to placate privacy advocates, the changes do little to provide real privacy. The legal right companies reserve to change their policies at any time reminds us that right now, the balance of power is clearly in their favor. When they do offer users choices, the choices fail to encompass all the ways online data collection implicates users' privacy.

But we don't believe that we are stuck with the status quo. In fact, there are many positive signs of companies making smart moves, because they're realizing they need buy-in from their users to survive in the long term.

Already, during recent Facebook controversies, we've seen users trying to determine how their data is shared. Google has created new tools that allow users a wider range of choices for controlling how their data is tracked. And every day, we see new examples of how data can be shared with users and customers as part of a service, rather than being treated just as a byproduct that is solely for the companies' use and enrichment.

We hope that our analysis will help push debate in the right direction. We hope that companies will see there can be real value and return in being more honest with their consumers. At the same time, we hope that as consumers and privacy advocates, we can work with companies towards useful solutions that balance privacy rights against the value of data for all of us.

About the Common Data Project
The Common Data Project is a new not-for-profit organization dedicated to changing the way we think about and use personal data. At the heart of its mission is the development of a new kind of institution called a "datatrust," a repository of sensitive information donated by institutions and individuals for public use. We are exploring organizational structures that will enable the datatrust to be structurally transparent and participatory, while also testing some exciting new technology that could drastically raise industry standards for what it means to guarantee "anonymization" of data. We are also taking what we've learned from our survey of existing privacy policies to explore the possibility of creating licenses for personal information. Such licenses would allow individuals to release personal data and define their own terms of use, in the spirit of Creative Commons licenses for intellectual property. For more information on our work, please see our Web site at www.commondataproject.org and our blog at www.myplaceinthecrowd.org.
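One well-known family of techniques for making "anonymization" a quantifiable guarantee is differential privacy, which answers aggregate queries with calibrated random noise. The sketch below is our illustration of that general idea, not a description of the specific technology CDP is testing.

```typescript
// Hedged sketch of a differentially private count. This shows one known
// approach to quantifiable anonymization guarantees; it is illustrative only.
function laplaceNoise(scale: number): number {
  const u = Math.random() - 0.5; // uniform on [-0.5, 0.5)
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

// A count changes by at most 1 when one person's record is added or
// removed, so Laplace noise with scale 1/epsilon masks any individual's
// presence. Smaller epsilon means stronger privacy and noisier answers.
function privateCount(records: unknown[], epsilon: number): number {
  return records.length + laplaceNoise(1 / epsilon);
}

console.log(privateCount(new Array(1000), 0.1)); // ~1000, give or take ~10
```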


In This Issue...
Drive to digital records begs due diligence

How to read a privacy policy

Editorial Advisory Board
Don Peppers
Partner
Peppers & Rogers Group


Martha Rogers, Ph.D.
Partner
Peppers & Rogers Group


J. Trevor Hughes, CIPP
Executive Director
IAPP


Larry Ponemon, CIPP
Founder
The Ponemon Institute


Jonathan D. Avila, CIPP
Vice President - Counsel, Chief Privacy Officer
The Walt Disney Company 
Research Update

Tailored Advertising

A study released this month reveals that 66 percent of American adults don't want tailored online ad content.

This first nationally representative study on behavioral targeting, conducted jointly by the University of California, Berkeley and the University of Pennsylvania, drills down on Americans' opinions about a variety of online marketing methods. One finding reveals that a majority (55 percent) of adults ages 18-24, although less likely than older adults to say no to targeted advertising, are opposed to the practice. Opposition increases to 86 percent when they learn that tailored ads are the result of the tracking of their Web activities on other sites.

Other findings:

  • 47 percent of respondents believe that they have lost all control over how personal information is collected and used by companies.
  • 62 percent of respondents erroneously believe that if a Web site has a privacy policy it means the site "cannot share information about you with other companies, unless you give them permission."
  • 92 percent of respondents agree that there should be a law that requires "Web sites and advertising companies to delete all stored information about an individual, if requested to do so."

For more information




This week...
The Maine Joint Standing Committee on the Judiciary met in Augusta this week to discuss changes to LD 1183, the privacy law that restricts marketing to minors. The law has been the subject of recent litigation.

David Lieber of DLA Piper testified at the hearing. He summarized the meeting in this week's Privacy Tracker e-mail and will go into more detail on the November 5 Privacy Tracker call.

Subscribe to the Privacy Tracker today for timely updates on Maine's children's privacy law and all privacy-related legislation.
Sponsorship Opportunities
Sponsorship opportunities are available for this newsletter, the IAPP Practical Privacy Series in Washington, DC, the IAPP Global Privacy Summit, and more.
E-mail
Privacy Tools
Privacy Career Postings
Learn More

Useful Privacy Links
Learn More

Publications from Peppers & Rogers Group
Learn More

IAPP Educational Library
Learn More

IAPP Certification Testing
Vancouver, BC
Fri., October 16

Folsom, CA
Mon., October 19

Ottawa, ON
Thurs., October 22

Edmonton, AB
Fri., October 30

Chicago, IL
Weds., November 4

Toronto, ON
Mon., November 16

Dallas, TX
Weds., November 18

San Francisco, CA
Weds., November 18


Learn More About CIPP Training and Testing
UPCOMING IAPP KNOWLEDGENETS
New York, NY
Mon., October 19
11:30 a.m. to 1 p.m.
Speaker: Kirk Nahra, Esq., CIPP, Partner, Wiley Rein LLP
Topic: Making Sense of the New Healthcare Privacy and Security Rules
REGISTRATION CLOSED - MEETING FULL

Toronto, ON
Tues., October 20
8:30 to 10:30 a.m.
Speaker: Ann Cavoukian, Information and Privacy Commissioner of Ontario
Topic: Privacy by Design and the SmartPrivacy Framework

Seattle, WA
Weds., October 21
11:30 a.m. to 1 p.m.
Speakers: Rob Gratchner, CIPP, Director of Privacy, Microsoft Corporation; Omar Tawakol, Chief Executive Officer, BlueKai
Topic: Behavioral Advertising
REGISTRATION CLOSED - MEETING FULL

Austin, TX
Tues., October 27
11:30 a.m. to 1:30 p.m.
Speaker: Erin Fonté, Attorney, Cox Smith Matthews
Topic: "Top 10" List of Legal Issues Related to Privacy for Business People and Attorneys

Boston, MA
Tues., October 27
11:30 a.m. to 1:30 p.m.
Speakers: Stephen Bernstein, McDermott Will & Emory; Tami Stein, Fidelis Security
Topic: Panel Discussion, Protecting Electronic Medical Data

Washington, DC Area
Tues., October 27
11:30 a.m. to 1:30 p.m.
Speaker: Allen Brandt, CIPP, Corporate Counsel, Data Protection & Privacy, Graduate Management Admission Council

Victoria, BC
Tues., October 27
11:30 a.m. to 1 p.m.
Speaker: Tim Mots, Office of the Information and Privacy Commissioner
Topic: Privacy in the Nonprofit Sector

San Francisco, CA
Weds., October 28
11:30 a.m. to 1:30 p.m.
Speaker: Dr. Lothar Determann, Partner, Baker & McKenzie LLP
Topic: Privacy, Employee Monitoring and Internal Investigations

Chicago, IL
Thurs., October 29
11:30 a.m. to 1 p.m.
Speaker: Liisa M. Thomas, Advertising, Promotions and Privacy Group, Winston & Strawn LLP
Topic: Targeted and Behavioral Advertising: What Every Company Needs to Know

Edmonton, AB
Thurs., October 29
11:30 a.m. to 1 p.m.
Speakers: Kevin Haggerty, Editor, Canadian Journal of Sociology, Department of Sociology, University of Alberta; Camille Tokar, researcher and University of Alberta student
Topic: Where Everyone Knows Your Name: Nightclub ID Scanning as Security Theatre

Columbus, OH
Tues., November 10
11:30 a.m. to 2:30 p.m.
Panelists: Jeremy Logsdon, Esq., Porter Wright Morris & Arthur; Richard Chapman, CIPP, CIPP/G, Kentucky Cabinet of Health & Family Services; Susan Frei, J.D., CTFA, National City, now a part of PNC
Topic: Lowdown on HITECH

Montreal, QC
Mon., October 26
11:30 a.m. to 1 p.m.
Speaker: Daniel Caron, Legal Counsel, Legal Services, Policy and Parliamentary Affairs Branch, Office of the Privacy Commissioner of Canada
Topic: The OPC's Facebook Investigation: How Canadian Privacy Law Made an International Impact
Language: French


RSVP Now

(Registration is required.)

About Us
The International Association of Privacy Professionals (IAPP) is the world's largest association of privacy professionals with more than 6,200 members in 47 countries. The IAPP helps define and support the privacy profession through networking, education, and certification.
Read More

Peppers & Rogers Group is a management consulting firm recognized as the world's leading authority on customer-based business strategy. The company is dedicated to helping enterprises identify differences within the customer base and to use that knowledge to gain a competitive advantage.
Read More
170 Cider Hill Road, York, Maine 03909 Phone 207-351-1500 or 800-266-6501 info@privacyassociation.org
Copyright © 2000-2009 International Association of Privacy Professionals.
The views in this eNewsletter, if any, are those of the authors and are not necessarily those of the IAPP.